It's May again. The flowers are growing, the birds are singing, and I’m getting ready to comment on my last stack of student papers of the term. When I finish, I’ll assign my students their grades. I’d love to be able to skip that last task and wish them all good luck, so it was with great interest that I read about Professor Cathy Davidson’s bold experiment with having her students grade one another. Let me say first that I'm all for the experimentation and the creative study of learning that Davidson is doing at Duke University, and I’ve long been interested in innovative teaching by Davidson’s former colleague Jane Tompkins (who also tried student self-grading) and research by educators like Alfie Kohn, who argues that competition interferes with the learning process. I admire Davidson’s scholarship, and I’ll look forward to her findings.
But Davidson, Kohn, and others can’t increase the number of spots available at medical schools, and they can’t allot a company more job openings than its revenue allows. Those entities depend on professors for our judgment of students, and until we can come up with a different way to apportion limited resources, we have to work within the system we have.
Grading certainly has its problems, and I’ve never met a teacher who enjoyed it. But just as Winston Churchill described democracy as "the worst form of government" except for all the others, so too with grading.
Let me put it more directly. I think avoiding grading (or some comparable form of rigorous evaluation by the instructor) shirks necessary responsibility, avoids necessary comparison, and puts the humanities at even greater risk of being branded "soft" than they already face.
It doesn’t surprise me that 15 of Davidson’s 16 students signed off on others' work, eventually entitling them to As. Such an outcome brings to mind Garrison Keillor’s description of Lake Wobegon as a community where "all the children are above average."
The bottom-line question is this: If everyone gets As, does that mean that Yale Law School will simply accept them all?
If an average class grade is an A, then graduate and professional schools will have to look elsewhere to find out how applicants differ. If I were an admissions officer, the first place I’d look would be to other courses with wider grade distributions, where the instructors rank and compare. Those other courses would weigh more heavily, and the professors who teach them would gain disproportionate influence in the decision process. Put simply, Professor Davidson’s colleagues who grade their students would be helping them more than she would.
Perhaps Davidson plans to make distinctions in the recommendations that she’ll write for the students when they apply for professional schools and jobs. But isn't that the grading that she was supposed to be avoiding in the first place, now done in secret? Davidson’s practice also fuels grade inflation, which disproportionately harms a college’s best students by devaluing their high marks. We need to be wary of such trends, and many colleges already are. Harvard recently moved to limit the percentage of its students who graduate with honors, which had swollen to a watery seventy-plus percent. Columbia University includes on a student’s transcript the percentage of students who got As in each class that the student took. Dartmouth and McGill are two universities that also contextualize their students’ grades. These elite institutions want to create a basis for discernment.
That discernment is personal, and it starts in each classroom. We need to be able to say to students in effect, "You did good work, but not the best in the class." It’s a way to be fair to the students and allow them to gain from their achievements.
The goal is not, of course, to make the classroom red in tooth and claw. I work harder at creating learning communities for my undergraduate and graduate students than at anything else I do, and it’s been well worth my effort over the years. I know that I have to keep seeking new ways to do this, because I agree with Davidson, Kohn, and others that students learn better when they can share the enterprise with each other.
There’s plenty of value to Davidson’s collaborative experiment, then — but grading is still part of her job, and mine, and all professors’. If we stop doing it, colleges and universities will eventually lose the esteem of the society that funds us. The humanities, already at risk, will be the chin that absorbs the direct hit.
Parents know that our children respect us when we save our highest praise for the achievements that merit it. I’m a big fan of Cathy Davidson’s work, and I’ve taught it to my own students. But abstaining from giving grades to students isn’t one of her better ideas. I say this with all due respect — and discernment. And that’s the same respect and discernment that we owe to the work of our students.
Leonard Cassuto is a professor of English at Fordham University, where he was named Graduate Teacher of the Year in 2009.
The American university, like the nation’s other major social institutions — government, banks, the media, health care — was created for an industrial society. Buffeted by dramatic changes in demography, the economy, technology, and globalization, all these institutions function less well than they once did. In today’s international information economy, they appear to be broken and must be refitted for a world transformed.
At the university, the clash between old and new is manifest in profound differences between institutions of higher education and the students they enroll. Today’s traditional undergraduates, aged 18 to 25, are digital natives. They grew up in a world of computers, Internet, cell phones, MP3 players, and social networking.
They differ from their colleges on matters as fundamental as how they conceive of and utilize physical plant and time. For the most part, universities operate in fixed locales (campuses) and on fixed calendars (semesters and quarters), with classes typically set for 50 minutes, three times per week. In contrast, digital natives live in an anytime/anyplace world, operating 24 hours a day, seven days a week, unbounded by physical location.
There is also a mismatch between institutions of higher education and digital natives on the goals and dynamics of education. Universities focus on teaching, the process of education, exposing students to instruction for specific periods of time, typically a semester for a course, and four years of instruction for a bachelor’s degree; digital natives are more concerned with the outcomes of education — learning and the mastery of content, achieved in the manner of games — which is why an online game pro will never boast about how long she spent at a certain level, but will talk about the level she has reached.
Higher education and digital natives also favor different methods of instruction. Universities have historically emphasized passive means of instruction — lectures and books — while digital natives tend to be more active learners, preferring interactive, hands-on methods of learning such as case studies, field study and simulations. The institution gives preference to the most traditional medium, print, while the students favor new media — the Internet and its associated applications.
This is mirrored in a split between professors and students, who approach knowledge in very different ways. Traditional faculty might be described as hunters who search for and generate knowledge to answer questions. Digital natives by contrast are gatherers, who wade through a sea of data available to them online to find the answers to their questions. Faculty are rooted in the disciplines and depth of knowledge, while students think in increasingly interdisciplinary or a-disciplinary ways, with a focus on breadth.
Universities and students also now see students in polar fashion. Higher education focuses on the individual, an ideal captured in 1871 by James A. Garfield, later president, who famously described the ideal college as Mark Hopkins, the 19th-century president of Williams College, at one end of a log and a student on the other. Today’s digital natives are oriented more toward group learning, multiple “teachers” or learning resources, and social networking, characterized by collaboration and sharing of content. This approach is causing an ethical challenge for universities, which under certain circumstances view collaboration as cheating and content sharing as plagiarism.
These are substantial gaps, complicated by the disparities in the way colleges and digital learners see their roles in education. Higher education is provider-driven in belief and practice. That is, the university, through its faculty, determines the curriculum, the content, the instructional methods, the study materials, and the class schedule. Digital natives tend to be consumer-driven, preferring to choose, if not the curriculum and content they wish to study, then the instructional method by which they learn best, the materials they use to learn, and the schedule by which they choose to study.
So what should be done? First, we need to recognize that this is not the first time colleges and their students have been out of step. In the early 19th century, as the industrial revolution gathered momentum, colleges in the main clung stubbornly to their classical curriculums, rooted in the ancient trivium and quadrivium, and to outmoded methods of instruction. College enrollments actually declined, and numerous institutions closed their doors. Bold colleges like Union, in Schenectady, New York — among the earliest adopters of modern language, science and engineering instruction — boomed in enrollment, topping Yale and Harvard combined.
Today, with college essential in obtaining most well-paying jobs, we will not see higher education enrollments drop. However, tardiness in acting will give impetus to the growth and expansion of alternative higher education — for-profit and nontraditional educational institutions that have been more successful in offering programs better geared to digital learners and their older counterparts.
Second, it is important to ask how much colleges and universities need to change. In 1828, facing industrialization and a Connecticut legislature that disapproved of Yale’s classical curriculum, the Yale faculty responded with a report which asked, in part, whether the college needed to change a lot or a little. This, Yale’s faculty said, was the wrong question. The question to be asked, they argued, was: What is the purpose of a college? This remains the right question today.
What is certain is that higher education needs to change, because students won’t, and the digital revolution is not a passing fad. To be sure, the purposes of the university have not changed. They remain the preservation and advancement of knowledge and the education of our students for humane, productive and satisfying lives in the world in which they will live. The activities of universities will continue to be teaching, research and service.
What must change, however, is the means by which we educate the digital natives who are and will be sitting in our classrooms — employing calendars, locations, pedagogies, and learning materials consistent with ways our students learn most effectively. It means that the curriculum must meet our students where they are, not where we hope they might be or where we are. All education is essentially remedial, teaching students what they do not know. This, for example, is a generation that is stronger in gathering than hunting skills. So let the curriculum begin with breadth and move to depth. Cheating and plagiarism violate the cardinal values of the academy, so let’s make it crystal clear to our students how and why they differ from sharing and collaboration.
It doesn’t make sense anymore to tie education to a common process; a uniform amount of seat time, exposed to teaching on a fixed clock, is outdated. We all learn at different rates. Each of us even learns different subject matters at different rates. As a consequence, higher education must in the years ahead move away from its emphasis on teaching to learning, from its focus on common processes to common outcomes. With this shift will come the possibility of offering students a variety of ways to achieve those outcomes rooted in the ways they learn best, an approach Alverno College in Milwaukee embraced four decades ago.
This needed transformation of the American university is merely the task of taking a healthy institution and maintaining its vitality. In an information economy, there is no more important social institution than the university in its capacity to fuel our economy, our society and our minds. To accomplish these ends, the university must be rooted simultaneously in our past and our present, with its vision directed toward the future.
Biological theorist Richard Dawkins writes in The Selfish Gene that if we wish "to build a society in which people cooperate generously and unselfishly towards a common good, [we] can expect little help from biological nature … because we are born selfish." Observers of academic scandal and fraudulent scholarship often attest to that. Conversely, economist Jeremy Rifkin believes "human beings are not inherently evil or intrinsically self-centered and materialistic, but are of a very different nature — an empathic one — and that all of the other drives that we have considered to be primary — aggression, violence, selfish behavior, acquisitiveness — are in fact secondary drives that flow from repression or denial of our most basic instinct."
Who is right, at least when it comes to professors?
Certainly, violence and aggression are facts of life on the typical campus, ranging from assaults, hate speech and shootings to gridiron wars ignited by tribal bonfires, beer kegs and primal weekend rituals.
As director of a journalism school at a science-oriented institution, I can attest that the empathic professor not only exists but daily displays the grace, forgiveness and tolerance usually associated with higher callings. Ours is such a calling. Who but the empathic professor, from overworked adjunct to distinguished don, could profess the same tenets of basic chemistry, composition and calculus semester upon semester, until seasons blend into one career-long academic calendar, were it not for love of learning and the instilling thereof in others?
Teachers, not politicians, shape generations. It has been so since Socrates and Confucius, and ever will be. (Would that state legislatures remember that when allocating funds!)
Too often, it seems, we report the antics, crimes and shenanigans of the Dawkins educator whose selfish gene believes attaining tenure is an entitlement and filing complaints, a fringe benefit.
Within a typical week, I, as director of 50 teachers, teaching assistants and staff members, witness or experience life-changing empathy. I hear it in the open doors of colleagues advising students, or in the break room celebrating birthdays or milestones, or in the hospital visiting a colleague gravely ill but still grading.
Within that same week, of course, I hear gossip, endure factions at faculty meetings, and get anonymous letters and email. Most of my professors realize my English Ph.D. includes a specialty in textual editing, so I can decipher who sent what. (See “Such Stuff as Footnotes are Made On.")
I’m writing about the empathic professor after a week enduring the Dawkins kind, not so much to remind myself that I am surrounded by kinder colleagues as to approach the topic philosophically so that you, too, might focus as I must on the good rather than the disgruntled in our midst. Is it possible that both Dawkins and Rifkin are right, or wrong, or partially so, or more right on one day but wrong the next, especially in the Ivory Tower? I am not a postmodernist promoting truth as illusion. Rather, I am a media ethicist and communication theorist who writes about the human condition, or the inharmonious duet in our heads conveying contrary instructions about the world and our place in it.
Professors, by and large, believe in the human condition but generally do not dwell on it in their disciplines. Ethicists must. In some ways, the human condition sounds eerily like a cable network of talking heads telling us 24/7 that climate change is a political conspiracy; energy consumption, a corporate one; universal health care, a socialist plot; pandemics, a pharmaceutical one; and so on.
Although few admit it, on most days we are paradoxical creatures who traipse in our encounters listening to cymbals of consciousness and piccolos of conscience. The former tells us, “We come into the world alone, and we leave it alone” while the latter asserts, “What is in me is in you.”
Which can be right?
Reading Inside Higher Ed, or any educational news site, we discern the chromatic scale of aggression, violence, selfish behavior and acquisitiveness and less often, the empathic tonalities of kindness, forgiveness and compassion. For better or worse, mainstream media and blogosphere reflect the human condition, what Wordsworth called the still, sad music of humanity.
As such, we are both homo avarus and homo empathicus. Avarus, Latin for “greedy,” dwells in the material world; empathicus, in a more metaphysical one. Our life’s work is that of a choral director attempting to harmonize them so that one enlightens the other. When we do, consciousness allows us to see the world as it actually is rather than how we would like it to be; to foresee the impact of our actions before taking them; and to assess the consequences of past actions to make informed choices in the future. Only then can we meet the demands of the conscience: that we love and are loved by others; that we have meaningful relationships with others; and that we contribute to community.
In my 2005 book Interpersonal Divide: The Search for Community in a Technological Age, I write that conscience grants us peace when we realize that how we treat others determines our own well-being and fulfillment. "Community," I assert, "is founded on that principle, from secular laws to religious morals."
Jeremy Rifkin writes about “empathic consciousness,” an organizing principle in his new book, The Empathic Civilization: The Race to Global Consciousness in a World in Crisis. However, when he states, "The irony is just as we are beginning to glimpse the prospect of global empathic consciousness, we find ourselves close to our own extinction," he easily could be discussing what I avow: the specter of global conscience.
Appropriately, that prospect is found in Article 1 of the United Nations’ Universal Declaration of Human Rights: “All human beings are born free and equal in dignity and rights. They are endowed with reason and conscience [emphasis added] and should act towards one another in a spirit of brotherhood.”
In media and education, we have listened too long to the cymbals of consciousness drowning the piccolos of conscience. The more educators raise consciousness about any number of public ills, the longer we seem to debate, explicate and irritate each other rhetorically rather than conscientiously, and the closer society comes to catastrophe. Overemphasis of consciousness has resulted in the repression of global conscience, our truer nature.
Conscience acts on simple truths. It does not debate whether climate change is fact or fiction; it intuits that burning so much fossil fuel is harmful to health and hemisphere. Consider the rhetoric of consciousness before the BP oil spill in the Gulf of Mexico — offshore drilling is vital to the economy — and compare that now to the awareness that pings within us daily. Neither does conscience associate universal health care with political systems but with bodily ones necessary to enjoy freedom, equality and dignity. It knows pandemics occur regardless of corporate balance books when the balance of nature is disrupted.
As The New York Times reported in 1992, Westerners advocating progress “thought they were nearly invincible, at least as far as infectious disease was concerned. They thought they could interfere with ecosystems, and ship goods and people around the world, with little regard for the effect not only on the balance of nature, but also on their own health.”
That balance of nature is on the agenda again and will be throughout our lifetime and our students’ and their grandchildren's lifetimes. There may not be any lifetimes thereafter unless we as teachers can instruct our charges to harmonize conscience and consciousness so that the duet augurs a new era of ethical awareness of the world and our sustainable place in it.
So I will close by reminding myself as well as others at the end of a trying academic year of slashed budgets, furloughs and firings that the empathic genes of our better natures will prevail. Otherwise the campanile also tolls for us.
Michael Bugeja directs the Greenlee School of Journalism at Iowa State University. His latest book, Vanishing Act: The Erosion of Online Footnotes and Implications for Scholarship in the Digital Age (Litwin Books), is co-authored with Daniela Dimitrova, an Iowa State colleague.
“Do the scientists really know? Will it happen today, will it?”
“Look, look; see for yourself!”
From my fourth-floor office window, I watched my students spring forth from their underground architectural studio to the plaza above, like meerkats spilling out of their dens. They came in twos and threes, cameras swinging from their necks, balancing their models as they surged out of the door, looking up at the sky expectantly.
The sun came out.
It was the color of flaming bronze and it was very large. And the sky around it was a blazing blue tile color. And the jungle burned with sunlight as the children, released from their spell, rushed out, yelling, into the springtime.
Quickly they tilted their models in the fleeting sun, capturing the shadows that they had not seen for several cloudy, rainy days.
In the midst of their running one of the girls wailed.
The girl, standing in the open, held out her hand.
“Oh, look, look,” she said, trembling.
They came slowly to look at her opened palm.
In the center of it, cupped and huge, was a single raindrop.
She began to cry, looking at it.
They glanced quietly at the sky.
A few cold drops fell on their noses and their cheeks and their mouths. The sun faded behind a stir of mist. A wind blew cold around them. They turned and started to walk back toward the underground house, their hands at their sides, their smiles vanishing away.
Then they came back inside, hopped on their laptops (not up the stairs to my office), and begged for a time extension on their assignment.
“I had to watch my brother play football this weekend.”
“Things don’t always go as we plan.”
“The forecaster said…”
I did not respond.
“But this is the day, the scientists predict, they say, they know, the sun…”
They needed sunlight for their assignment, due the next day. They needed to observe and photograph clear shadows on their architectural models, using a sundial to simulate these shadows at various times of the day and year. They’d had two weeks already, the first week and a half of which had been unremittingly sunny.
I waited a while longer. Finally, when the sun still wasn’t forthcoming, I wrote back with some constructive advice. I told them what I would do in their position, had I painted myself into that particular corner — I would use the light from a slide projector, which is less than ideal, but better than nothing. They didn’t like my suggestion. They parsed words like “partial credit” and brought out the predictable “you didn’t say that in class.” They wanted the sun, the real sun, which would redeem them and make everything all right. And at the 11th hour, it came back out.
… they were running and turning their faces up to the sky and feeling the sun on their cheeks like a warm iron; they were taking off their jackets and letting the sun burn their arms.
“Oh, it’s better than the sun lamps, isn’t it?”
“Much, much better!”
Most of them got to see the sun for just enough time to finish the assignment as intended. But I found out later just how alien the sun still was to them, and sadly, to me, though we live on Earth and not in the near-perpetual rain of Venus, like the children in Bradbury’s story.
One of my students, a girl with clear blue eyes and smooth, straight, light brown hair, came to visit me shortly after the first test. She wanted to check which questions she’d gotten wrong, since she’d done so poorly. She was frustrated that she’d focused too much on the wrong things while studying and at first I was at a loss to help her. Finally we came to a moment of enlightenment. She was surprised that I had asked her to be able to figure out where the sun would be in the sky at various times of the day and year. I had expected that she and her peers had internalized something from recording the sun’s position during their sundial exercise. In short, I had expected her to be like Margot, an earth-born girl who knew the sun by heart.
But Margot remembered.
“It’s like a penny,” she said once, eyes closed.
“No it’s not!” the children cried.
“It’s like a fire,” she said, “in the stove.”
“You’re lying, you don’t remember!” cried the children.
My student admitted that she didn’t really understand this business about the sun. While flipping through the appendixes of the textbook looking for sun path diagrams to show her, I realized that I still didn’t really, either. I still needed to look it up. As I lay in bed that night, I dreamt up a “sun dance” that I would do in class the next week. It was designed to help the students, and me, remember where the sun is in the summer, winter, spring and fall. Because we all know it, but we all forget. Sitting in that oversized, refrigerated auditorium where my lectures are held, there’s no way we could know what the sun is doing. So in the next class, we stood up and danced:
“It’s the winter solstice. Face south. Stretch out your arms, a little forward. Your left fist is the sun, rising above the horizon to the south of east. Lift it up through the southern sky, in front of you. The angle is low; it will reach into the building. Now raise your right hand to meet it at its highest point, and arc back down to the south of west.”
“OK, now it’s the equinox. Reach your arms straight out to the sides. On the equinoxes, the sun rises directly in the east and sets in the west. It’s now higher in the sky.”
“Now it’s the summer solstice. Stretch your left arm behind you. The sun rises north of east, shines on your back, the north face, at a low angle. As it rises to its apex, it’s even higher in the sky; now you can block it with an overhang. As it sets, the north façade receives this low, western sun.”
… they squinted at the sun until tears ran down their faces, they put their hands up to that yellowness and that amazing blueness and they breathed of the fresh, fresh air and listened and listened to the silence which suspended them in a blessed sea of no sound and no motion. They looked at everything and savored everything.
But I learned, months later, that they didn’t appreciate the dancing. They complained about it to my program chair and on my course assessments, saying it was beneath them, that I was talking down to them.
“She belongs in an elementary school classroom.”
“It is unfair to assume that college classes should involve dancing.”
“No,” said Margot, falling back.
They surged about her, caught her up and bore her, protesting, and then pleading, and then crying, back into a tunnel, a room, a closet, where they slammed and locked the door. They stood looking at the door and saw it tremble from her beating and throwing herself against it. They heard her muffled cries.
Once I learned about the students’ objections, I reacted as quickly as I could. In class, I became more subdued, more opaque. I tried to show more and explain less. I stopped dancing.
Spring came, and with it, more chances for us to get out of our windowless classroom and to see firsthand the work of architects and builders who worked with the sun in a far more direct and convincing way than my abstract explanations could ever convey. I learned the hard way, like Margot, that I can’t really describe the sun. The students have to see it for themselves.
On the last day of classes, they evaluated me again.
“Your opinions are important as we make plans for this course in the future. Please be candid about what topics and experiences you felt were useful, and which ones weren’t,” I heard myself say. What I thought was the same thing all new teachers think, “I am trying to teach you in the best way I know how. Please be kind.”
They stood as if someone had driven them, like so many stakes, into the floor. They looked at each other and then looked away. They glanced out at the world that was raining now and raining and raining steadily. They could not meet each other’s glances. Their faces were solemn and pale. They looked at their hands and feet, their faces down.
One of the girls said, “Well…?”
No one moved.
“Go on,” whispered the girl.
I left them there, filling out that one last set of bubbles before they were set free. For me, retreating down the corridor, it was a moment of reckoning; for them, a chore barely restraining them from running out into the May sunshine.
They walked slowly down the hall in the sound of cold rain. They turned through the doorway to the room in the sound of the storm and thunder, lightning on their faces, blue and terrible. They walked over to the closet door slowly and stood by it.
Behind the closet door was only silence.
They unlocked the door, even more slowly, and let Margot out.
Elizabeth Grant is an assistant professor in the College of Architecture and Urban Studies at Virginia Tech.
Teachers and students have always been an important market for Apple — a fact made clear by the tremendous amount of spit and polish that went into the new education website the company recently unveiled. But honestly: What do Apple’s slickly produced promo videos of adorable multicultural elementary schoolers have to do with us? And just how relevant is their newly released iPad for what we do? Do academics really need to shell out five hundred bucks for what is essentially a big iPod touch?
After using an iPad since shortly after its release, I can safely say that the device — or another one like it — deserves to become an important part of the academic’s arsenal of gadgets. Choosing to plop down the money for an iPad is like Ingrid Bergman’s regret over leaving Casablanca with Humphrey Bogart. You will do it: not today, not tomorrow, but soon — and for the rest of your life.
At base the iPad is an anything box that replaces a seemingly endless array of other things you already own: It's a TV, a radio, an MP3 player, a compass, a flashlight, a level, a deck of cards, a calculator, a photo album, an alarm clock, a Bible, the Talmud (yes, the Talmud has been ported to the iPad)... the list goes on and on. The crucial question for academics is: What in our current arsenal will the iPad replace? After using the device, the answer surprised me: the iPad makes a lousy computer replacement, but it does a great job of replacing paper.
Let me begin by getting one thing straight: When it comes to weaning professors off of traditional computers, the iPad fails. It is simply not a good device for people who do serious productive work, whether that be reading, writing, or working with multimedia. The iPad’s on-screen keyboard simply cannot hold a candle to an actual keyboard, even for academics who are veteran texters well-versed in the use of autocomplete functions. You could get a keyboard for the iPad… but then you’d be using a netbook.
Apple deserves credit for making the thing as usable as it is, but it is still not quite there. You can browse on it, but you can’t quickly and effectively search databases. You can read e-mail messages, but it takes a tad too long to write them. The screen is much more generously sized than a cell phone’s… but such a comparison simply damns the iPad with faint praise. Over time the iPad may get more usable as the software improves, but its size will not. And so until the human visual field shrinks and our fingers no longer require tactile feedback, we academics will be sticking to our keyboards and screens.
Where the iPad does shine is as a paper replacement. It is the long-awaited portable PDF reader we have been hoping for: finally, a device that preserves formatting and displays images, charts, and diagrams. After decades of squinting at minuscule columns of photocopied type, we can now zoom in on the articles we are reading and adjust the text perfectly to the width of the screen. You can even highlight and annotate documents and then send the annotations back to your computer as notes.
True, some people dislike backlit screens, but they’re great for reading at night, and despite some early evidence to the contrary, LED screens don’t cause eyestrain any more than eInk does. The device is slightly heavier than the Kindle and Nook, but it is still ultra, ultra portable and ultra usable. It makes you read more and saves paper — which is clearly a good thing. Because of the iPad I’m finally untethering myself from paper files. In fact, these days I’d rather buy an eBook and export the annotations to my notebook program than add another underlined book to my library — an amazing turnaround for someone who once ranted on this very website about his passion for paper.
The reason the iPad is such a great paper replacement is Apple’s app store. Devices like the Kindle sell you content from a single source and allow you to read it in a single way. The iPad, on the other hand, allows third-party developers to create (and sell) different "apps," or programs, that live on your iPad. This means developers can build better and better apps for reading PDFs, and we can use them without having to buy a new device.
Now, these are still early days for the iPad, and the software is still developing: I have to get my PDFs onto my iPad with one program and open and read them with another. But clearly things will improve. The makers of the überbibliography program Sente are already working on an iPad app, and soon they and others will make the device even more useful. The only reading aid that can’t be downloaded to the iPad is a stylus — that you’ll have to buy yourself, and trust me, it is genuinely useful, even on a "magically" touchable device like the iPad.
That said, the revolutionary thing about the iPad is not software for reading content, but for finding (and buying) it. The iPad represents the genuine retailization of academic content. Let me explain:
Currently, folks like Elsevier act as content wholesalers, selling great bucketfuls of the stuff to libraries, which then make it available to students and professors. As journals have slowly transitioned away from paper, they have pursued business models of the "purchase this enormous bundle of journals you don’t want or else our Death Star will destroy another planet of your Rebel Alliance" variety. Individual articles are prohibitively expensive, and academics must fight through a tangled, messy mass of proxy sign-ins and authentication web pages while their IT staffs make embarrassing, eye-averting administrative decisions not to think too much about the copyright of what is being posted on class Web sites.
Amazon and others have led the way in producing apps that allow you to read content across different devices: once you purchase an ebook from Amazon you can read it on a Kindle, an iPad, a Mac, or a PC. This in turn raises the question: What would happen if journals went straight to consumers and sold articles like they were mp3s? What if you could log on to your ScienceDirect or JSTOR app and get a complete browsable list of your favorite journal articles, available for purchase for, say, 25 cents each?
Academics are ready for this development. We’ve spent years succumbing to Amazon’s fiendish "get drunk and use our one-click purchase feature" approach to buying books online, and we often download tons of PDFs just to feel productive. Apps with alerting and micropayment systems could provide massive distribution, pushing new issues of journals to your digital reading device. As such they offer a world where everyone can read exactly the articles they want. Individuals, not institutions, could purchase content — exactly the content they’d like, regardless of whether their library subscribes to it. Publishers might object that piracy would be a concern, but honestly: If you’re selling content to universities that license it to tens of thousands of students living in highly networked dorm rooms, is an app store really going to make the problem worse?
There are plenty of outlandish scenarios to imagine: professors who create specialized current content lists or anthologies of classic or cutting-edge articles, essentially filtering wholesale content and retailing it to increase their academic prestige (or even a chance to dip their beaks). Classrooms where student readers are easy to assemble and cheap — something textbook companies have tried unsuccessfully to do for some time. Librarians free to give up their increasingly restrictive role as purchasing agents and get back to old (and new!) roles of developing collections and enriching their institutions.
A key feature of the retailization of scholarly content is that it be reasonably free of digital rights management -- and here academic publishing should learn from the music industry’s failed attempts to sell copy-protected music. The more open and reusable academic content is, the more reasons people will have to buy it. The great thing about PDFs is that, like MP3s, they are not copy-protected. While some, like the Google book settlement, have sought to meter content down to the word in the name of "choice," such a move will ultimately prove equally stifling. Neither locking down our ability to move texts around nor micrometering them to death are good outcomes for the future of scholarly communication.
As an anything box, the iPad has the potential to replace a whole variety of devices that we use in our research, from voice recorders to GPS units to tuning forks. To be honest, however, I am not sure just how many niches there are here for Apple to fill. The iPad is an expensive device to take to the field, and a lot of the time it is just cheaper and easier to buy a tuning fork. In addition, the app store lacks the super-deep selection of specialized programs currently available for normal computers.
I'm sure there are certain cases where an iPad might make a great mobile device: photographers who want to view, edit, and upload their photos on the fly, for instance. Overall, however, by splitting the difference between dedicated devices and genuine computers, the iPad doesn’t show a lot of promise as a mobile platform for research and teaching. Of course if everyone is always carrying around an iPad already then they might start replacing voice recorders. It's hard to tell. My bet is that tuning forks and compasses are not going away.
Finally, I’ve been talking about how the iPad helps academics do academe better — but does it offer the ability to do academe differently? Is this device truly "magical" in a way that will radically innovate academe?
While I can imagine some innovative pedagogic uses of the device, what academics do is still narrowly defined — and tied to institutional, political, and economic imperatives. Some imagined the Internet would cause us to rethink what it meant for a text to be coherent — and it has, to a certain extent. But really it has just reinforced our chunky, discrete notions of texts by making it easier to share PDFs and .docs. The academy might be too obdurate to be easily transformable.
At heart, an anything box like the iPad might not be such a dramatic agent for change anyway. The iPad is a chameleon, able to assume the form of other things but lacking (so far) its own unique identity. You can introduce Twitter into the classroom, but Twitter is the innovative factor here, not the iPad. It may be that someone will write the killer app for the iPad that will mutate our activities in unimaginable ways. But for now those ways remain… unimaginable.
Indeed, it may be that the iPad is just the harbinger of some future tablet device that is yet to come. That future device might not be from Apple, but it will owe a lot to the iPad. Ultimately, academics need a world full of devices they can pour information in and out of. The more open and interoperable our new ecology of applications, devices, and content providers is, the more our learning will enrich human life — whether the people selling us our readers, software, and content are Apple, Amazon, or someone else entirely.
Alex Golub is assistant professor of anthropology at the University of Hawaii at Manoa.
Shakespeare’s Much Ado About Nothing reveals the ways in which malicious and unfounded accusations can destroy lives, friendships, families, and institutions, including academic and military ones. During her wedding ceremony, a bride’s fiancé falsely accused her of prior promiscuity. The fiancé and his lord believed they had seen the evidence of the bride’s infidelity with their own eyes, but the evidence had been cooked by the lord’s bastard brother, who staged a misleading scene to deceive them. Besides destroying the wedding and humiliating the innocent bride, the slander led to dissension within the state and the army. It took a fool who proudly called himself an “ass” to bring the unfounded accusation to the attention of the authorities, the fiancé, and the lord. They exemplified virtue by acknowledging and repenting their overreaction to the false accusation, thus leading to a happy ending believable only in romantic comedy. All’s well that ends well in comedy, so in this case the false accusation was indeed much ado about nothing.
But that is not always the case. Scott Jaschik’s Inside Higher Ed article, “YouTube and Context,” makes clear that I was falsely accused of advocating rape in a lecture I gave on Joseph Conrad and Niccolò Machiavelli at the annual ethics conference at the U.S. Naval War College this past May. The accusation occurred via the Internet on YouTube. A sound and video bite of a little over three minutes from my lecture was posted under the headline, “Naval War College Professor Advocates Rape.” Within a few days, over two thousand viewers had seen the clip, which soon attracted the attention of the Pentagon and Congress.
The only problem is that I never advocated rape, which would be crazy in any forum, especially an academic one, and most especially a military one. When an accusation sounds too crazy to believe, look again. Gender-related sensitivities in the American military, going back at least as far as the infamous Navy pilots’ Tailhook groping scandal, make leaders extremely careful to avoid giving offense to anyone. And indeed, I was not speaking in my own name. Instead, as the full transcript of my remarks reveals, I was explaining why Machiavelli deserves his infamy as a “teacher of evil”: he did indeed advocate the rape not of women, but of the peoples and countries his ideal leader would subjugate. Hence the title of one of the most insightful books on Machiavelli today, Machiavelli’s Rapacious Republicanism, by an old acquaintance of mine from graduate school, the brilliant Austrian scholar Markus Fischer.
Interpretation is not advocacy. I was interpreting Machiavelli, not advocating Machiavellianism. The person who posted the clip either did not know the difference, in which case he or she was not prepared intellectually for the thoughtful discussions of any academic institution, or did not care, in which case the individual defamed not merely me, but also my institution by deliberately taking my words out of context. As one of my senior colleagues has remarked, the YouTube post was "an act of cyberterrorism not merely against Karl Walling, but the War College itself."
Such libels are bound to be increasingly common in the YouTube age and a threat to any professor in the classroom. Any one of us could be next. How can we speak freely if we must fear that any student might post distortions of our remarks on the Internet? Can we allow video vigilantes to incite mobs in the university? Can administrators be intimidated by the vigilantes and still retain the trust and respect of faculty? Don't forget that a significant portion of world opinion believes that the lamentable events of 11 September 2001 were the result of a conspiracy in the Bush administration, or Israel, or any of a number of the usual scapegoats on libelous Internet websites, not the work of Al Qaeda. This despite the fact that Al Qaeda has claimed credit for the attack! How can we prevent the cyberterrorists from winning?
Because this is the first time my institution has had to deal with this rising threat to academic institutions, it made several rookie mistakes in handling it, but it should be those mistakes, not the individuals who made them or the institution itself, that are the issue now. My institution may be the most intellectually happening place in the American military, but we are all rookies at dealing with Internet libel. We have a common enemy in those who would attack the academy with the Internet equivalent of scribblings on bathroom walls. What can academic institutions do to prevent such mistakes in the future?
Both common sense and common courtesy would dictate informing a professor about a potentially scandalous Internet clip from his or her lecture, seminar, or other professional work, and asking for an explanation before demanding an apology or taking disciplinary action. Especially in light of the Shirley Sherrod incident, in which a conservative blogger defamed a member of the Obama administration by deliberately posting a clip from her remarks that made her seem to say the opposite of what she intended and actually said, prudence would dictate a careful investigation of the facts, including a transcript when it is available, before making hasty judgments.
Much against my own judgment, under heavy pressure, and before I saw the YouTube clip, I did issue a tepid apology, the gist of which was: blame Machiavelli, not me. He, after all, was the one who used rape as a metaphor for leadership. Discerning members of the audience understood this, but the sensationalist farce acquired an unstoppable momentum of its own. That YouTube has withdrawn the libelous video from its site, following the publication of Scott Jaschik’s article and perhaps also requests from my institution, is no great consolation. The post generated at least a dozen other articles and two television stories. The effects of this false accusation will endure as long as they remain unrefuted on the Internet. Hence, when the facts are finally known, and when they reveal that the accuser distorted a professor’s words to make him or her appear to say exactly the opposite of what the professor intended and actually said, make the facts known widely and publicly. Just do the right thing, as the Obama administration did when it acknowledged Shirley Sherrod had been defamed.
Shirley Sherrod knows the name of her accuser, whom she reportedly intends to sue. My accuser used an anonymous e-mail address. My institution apparently has no conclusive evidence to identify him or her yet, and may never acquire it, so some thought needs to be given to how to deter libel when anonymous e-mail addresses may make posters unaccountable.
As often happens in moments of hysteria, it is sometimes tempting to blame the victim. I used the word “bitch” twice in my remarks: once in depicting the mindset of a rapist; the other time in portraying the victim’s likely attitude toward her rapist. So I was reprimanded for using offensive language, though it is not my words, but Machiavelli’s view of leadership that is truly offensive. Rape is a common metaphor for conquest and tyranny. As revealed in Chapter 25 of The Prince, in one of the most famous passages in Renaissance literature and philosophy, Machiavelli used the metaphor of the rape of poor Fortuna to reduce politics to war and war to crime. The word hubris, often translated as overweening pride and a common theme not merely of tragedy but also of strategy, stems from a Greek word for rape, with hubristic characters depicted as having lost all sense of limits. Machiavelli challenged the philosophy and religion of his time by questioning whether there can be any ethical limits to strategic thought and action. Unless conferences on professional military ethics are to be mere Sunday school exercises, that question deserves serious attention from those engaged in unconventional wars, in which the customary limits of war come frequently into dispute. What better way to reveal what is most shocking in Machiavelli than to use language that approaches the limit of what is considered acceptable in our time?
It would take the comic genius of Tom Wolfe to explain how my critique of Machiavelli was twisted into the advocacy of the very crime for which I was indicting him. Not merely feminists (who can easily find at least a hundred articles on Machiavelli and feminism with a quick web search), but all decent minds should turn their anger on Machiavelli, not me, while recognizing that he was also a political and military genius, the sort both insurgents and counterinsurgents, terrorists and counterterrorists have much to learn from today. With the United States bogged down in two counterinsurgencies in Iraq and Afghanistan, understanding Machiavelli could prove very useful, if only for learning to think like our worst enemies. How can we learn from evil geniuses without becoming like our own worst enemies? That was one of the big questions of my lecture. That it was obscured by a reckless vigilante is a terrible, terrible pity.
It will take careful thought to save academics from this sort of outrage in the future. It will require a mix of technological, ethical, and institutional fixes. I do not believe it is possible any longer for individuals at my institution to post clips of lectures from its video archives without permission. So there is now a gatekeeper, though perhaps at the regrettable price that recordings of important lectures will be less freely available in the future. Whether gatekeepers are worth this price needs to be examined carefully. It may depend on circumstances.
Since anyone with a cell phone could commit the same offense, technological fixes of institutionally-controlled Internet systems will certainly not be enough. The most unsung heroes of colleges and universities are those who teach English composition. Just as they do (or should) teach rules of evidence for written citations, so too ought they teach students to apply those same rules to video citations, with students warned that plagiarism, deliberate distortions, misleading quotations, and the like are not merely unethical, but may also put them in serious legal jeopardy. My institution does not have a faculty senate, but it surely needed one in this instance to slow down the rush to judgment. Institutions that already have faculty senates might assign Internet libel cases to committees within them, which would serve both the dignity of those institutions and the rights of the accused by providing some form of due process.
And one other thing. Professors teaching Shakespeare might use Much Ado About Nothing to get students to think about why libel is a serious problem, which will help them understand why the thoughtfulness induced by careful reading of old books is relevant to our so-called information age, and perhaps our only salvation from the snap judgments that age frequently induces. Such thoughtfulness is the aim of my teaching, which, with a little drama now and then, has helped me turn on more than a few light bulbs. It would be a crime to let the cyberterrorists turn out the lights of the academy.
Karl Walling is a professor of strategy at the United States Naval War College.
William Buckley famously said he’d “rather be governed by the first 2,000 names in the Boston phonebook than by the dons of Harvard.” In my 14 years as president of a leading liberal arts college, I grew weary of overworked jokes that likened leading a faculty to herding cats or kangaroos. Looking back, I recognize in them a bit of bravado masking an awkward misalignment. Faculty are proudly autonomous, defiantly so, independent thinkers who give each other as much trouble as they give the administration when one or another of them raises a head above the herd in a gesture of leadership. Faculty are socialized as individuals, not as members of a group; taking a broader view runs against the grain for many of them, in the ways and for the reasons Hugh Heclo enumerates in his insightful book, On Thinking Institutionally. And yet the principle of “shared governance” requires a faculty capable of effective self-governance in partnership with professional administrators and a voluntary governing board.
The institution I was privileged to lead and others with which I’ve been affiliated have wonderful faculty – exceptionally engaged, responsible, and responsive in virtually every respect. Yet from the day I arrived on campus as a new president, I was schooled in a cultural norm that the better part of valor was to tiptoe around the faculty. It was as though "the faculty" as a whole was a hibernating bear no one dared disturb for fear of being mauled. I could see all the ways in which the faculty as a body – a "constituency" in academic parlance – was being watched, coddled, and handled with enormous investments of energy and studied restraint. Over time, as I became adept at reading the emotional force fields on campus, I realized that this strenuous effort was thinly masking an undercurrent of fear. And this, I have come to learn, is true to one degree or another through much of the academy.
The fear arises out of an intellectual culture that is awash in competition and critique, in picking ideas apart and taking no prisoners. Critical thinking and skepticism are the coins of the realm. But skepticism can devolve to cynicism, and criticism to contempt, an acrid brew of belligerence and disengagement that can poison morale and yield a system of self-governance far better suited to obstruction than construction. This is a pity because it matters, both educationally and strategically.
Educationally, students pay close attention to how the "grown ups" on campus behave. The academy remains arguably one of the last major sectors in American society still making a good-faith effort to both uphold and enact the view that in a healthy democracy we have obligations to one another. This includes the obligation to resolve differences by enabling the majority to form its collective judgment through meaningful discourse in which all relevant positions are fully aired. "A democracy needs citizens who can think for themselves rather than simply deferring to authority, who can reason together about their choices rather than just trading claims and counterclaims," Martha Nussbaum wrote in Cultivating Humanity.
Strategically, faculty governance bodies have pressing work to do in this era of shrinking resources and accelerating global competition. If they once routinely fostered authentic and serious public debate about real educational problems, discussion now too often deteriorates into something even less informative than a clash of competing claims, a spectacle more akin to disconnected “serial oratory.” At my own institution, and at others I knew well, it was mystifying to see faculty members we revered for their pedagogic virtuosity – faculty who were creating in their private classrooms exquisitely hospitable venues for courageous exploration of controversial ideas – so stuck in old and unsatisfying habits when trying to resolve conflicts in the academic calendar, come to terms with grade inflation, or revise the curriculum.
These discussions moved painfully slowly and unpredictably. Often a lone, loud voice or a mobilized minority faction would hijack the conversation in the eleventh hour. I couldn’t help but wonder, at these times, whether this would be happening if the faculty as a whole were more vividly experiencing itself weighing evidence and making wise choices on matters of curricular or educational consequence and then feeling bound to one another by their collective decisions.
Many faculty are increasingly conscious of imbalances within their own ranks, frustrations they discuss privately with deans or presidents hoping for a simple solution from on high. Rarely do they come together to explore their mutual accountabilities: to one another, to their departments, to their disciplines, and to students other than those they see directly in their own classes, offices, studios or labs. Some carry a disproportionate load for their institution as a whole, while others seem to ride more or less free. Disparities of this kind seem to be widening.
When one or another faculty member would bring an injustice or a dispute to the administration for adjudication, I often felt tempted to weigh in with what looked like decisiveness. I learned, though, that only the faculty had the power to resolve differences among themselves. The impulse that flows from perceived inequities is to tighten central controls. But that only exacerbates the problem. People who feel under surveillance resist authority, or withdraw, or both, feeding a vicious cycle: more controls, less commitment. Rather than acquiesce in the imposition of more central controls, faculty themselves would do well to shore up their own systems of citizenship, taking account of the increasing complexity of faculty work, while recognizing that the institution’s continued success will require ever greater interdependence.
In some schools, the economic downturn has brought faculty into new relationships with the administration and the trustees on budgetary decision-making, strengthening their roles in shared governance, at least for a time; in others, the reverse has occurred. As financial and competitive pressures continue to bear down on all institutions of higher learning, the incremental changes many have been making to ride out the recession – draining reserve accounts, deferring maintenance, making across-the-board budget reductions, reducing staff, relying more on contingent faculty – are likely to shift more work onto faculty shoulders and erode the quality of their work lives. If budgets have to be trimmed further, it’s hard to imagine finding additional economies without reconsidering the organization of the educational enterprise itself and the assumptions behind it: how students learn, how faculty teach, the nature of the curriculum, how everyone uses time.
I worry that the professoriate may be standing at the threshold of a shake-down as disruptive as was the restructuring of medical work that began in the 1970s when health care costs began to spiral out of control, the process that Paul Starr analyzed with such foresight in The Social Transformation of American Medicine. And I worry that colleges and universities with strong faculty – brilliant scholars, devoted teachers, radical individualists, and stubborn skeptics who treasure autonomy, resist authority, distrust power, and who love their institutions as they have known them – may find it especially difficult to bring faculty together, bring departments together and make timely, wise, informed and realistic choices about a future worth having.
Over the next decade, colleges and universities are likely to need greater flexibility, organizational resilience and openness to new ideas, and, at the same time, stronger internal systems of shared responsibility, accountability, collaboration and communication. They will need to become more fluid learning organizations, better positioned to capitalize on the forces of change, and better able to make and defend potentially divisive choices, while remaining true to the purposes that will ensure continued success.
Faculty will need to be clearer about those purposes and about the essential ingredients of the education they want their students to expect and receive – an integrative education that prepares new generations to take their places in a world of mounting complexity, interdependency, inequality ... urgency. They will need to do a better job of modeling the serious engagement of their own differences that integrative learning clearly implies and that enlightened organizational stewardship absolutely necessitates.
Diana Chapman Walsh served as president of Wellesley College from 1993 to 2007.
The press and the blogosphere have devoted significant coverage recently to a report by the Georgetown University Center on Education and the Workforce that predicted that the United States is on a "collision course with the future." The report estimated that within a mere eight years, the nation will suffer a shortfall of at least 3 million workers with college degrees and 4.7 million workers with postsecondary certificates. The authors of the report concluded that to meet the challenges of a global economy in which 59 to 63 percent of domestic jobs require education beyond the high-school level, America’s colleges and universities "need to increase the number of degrees they confer by 10 percent annually, a tall order."
Although numerous commentators have responded to the report by echoing its call for increased access to higher education, it seems to me that few have focused on a key term in the report’s call to "develop reforms that result in both cost-efficient and high quality postsecondary education." Producing millions more baccalaureate-educated workers will do nothing to address the competitiveness of the U.S. workforce if those degrees are not high quality ones. Sadly, it is pretty clear that far too many college degrees aren’t worth the paper on which they are printed.
In 2006, the Spellings Commission reported disturbing data that more than 60 percent of college graduates were not proficient in prose, document, and quantitative literacy. In other words, significantly more than half of college degree holders in the United States lack the “critical thinking, writing and problem-solving skills needed in today’s workplaces.”
Robert Atkinson, president of the Information Technology and Innovation Foundation, cited these findings in his recent Huffington Post essay, "The Failure of American Higher Education." He shared stories about recent college graduates, many from prestigious universities, who applied for jobs at his think tank but were unable to complete basic tasks such as summarizing a person’s credentials into a short biographical sketch or calculating an average using a spreadsheet. Atkinson argues that one of the primary reasons for the inability of so many college graduates to think, write, speak, argue, research, or compute proficiently is that colleges “are focused on teaching kids content, not on teaching them skills.” His explanation for this is that members of the professoriate are not interested in teaching these important skills, but rather are interested in exploring the content of the subject matter in which they specialize. Atkinson then advocates several "solutions" to his perception of the problem, which include a requirement that all college graduates take a national test to measure skills competencies and “radical experimentation” in college design that focuses “on teaching 21st century skills, not 20th century subjects.” These ideas are typical of the well-intentioned but misinformed suggestions that abound these days about higher education.
The commentators are correct that there is a mismatch between what faculty members are doing and could be doing to teach students. But the problem isn't a lack of faculty interest in students, but a broader set of staggering challenges facing professors – challenges that deserve more attention.
First, college and university faculty members often lack the ability to teach basic reading, writing, and math skills. Why? Because most professors are not trained to do so. With few exceptions, doctoral programs focus on teaching disciplinary content and methods of inquiry, not pedagogy. Even in universities that provide their doctoral students with a "preparing future faculty" program to help Ph.D. candidates develop some teaching skills, such programs focus on teaching and learning at the college level, not on basic reading comprehension, the fundamentals of composition, or elementary quantitative skills. The K-12 educational system is supposed to teach these abilities. By the time students get to college, faculty members rightfully expect that they will already know how to calculate an average or summarize the main points of a newspaper article, a book chapter, or a journal article. Accordingly, faculty members then see their role as honing students’ critical thinking abilities within the context of analyzing, synthesizing, and evaluating information, often within a disciplinary framework.
These assumptions were fair ones once upon a time. Sadly, though, far too many students who have earned a high school diploma are unable to meet such expectations. Apart from a handful of specialists in English departments, most college faculty members are simply ill-equipped to teach students how to begin writing coherently. Professors expect to give students feedback on how to write more efficiently and persuasively, not to teach tenses, subject-verb agreement, or basic punctuation. Yet these are the types of problems with which faculty routinely try to cope, at least for a while. And that leads to my second point.
Given the woefully inadequate preparedness of high school graduates to engage in college-level work, many professors quickly become burned out attempting to teach skills that they never expected they would need to teach at the postsecondary level. I have heard dozens of colleagues from across the country at different types of institutions of higher education say, "I didn’t earn a Ph.D. to teach what should have been taught in elementary and high school." Many such instructors give up; rather than teaching the skills that should have been learned before students arrive in college, they focus on content because it’s easier to do so. There is only so much that can be done over the course of a college quarter or semester. Worse yet, they fear holding students to high standards for a myriad of reasons, which is the third problem I wish to discuss.
College faculty members, especially those who are untenured, often fear setting course expectations too high, challenging students’ comfort levels too much, or being rigorous in their assessments of student performance. If students perceive a professor as being too hard, they will avoid that person's classes, which can lead to under-subscribed classes being canceled. Full-time faculty whose courses are canceled may be reassigned to less desirable duties; part-time faculty members whose classes are canceled often find themselves without any courses to teach. In addition, students often "punish" faculty members they perceive as being too demanding by evaluating them poorly at the end of a course. Because low student evaluations can lead to both tenure-track and adjunct faculty being fired, untenured professors may keep workloads at levels that students perceive to be reasonable and assess their performance more generously than is actually deserved. Much has been written on this phenomenon as one of the leading factors contributing to the nationwide problem of grade inflation, the fourth issue I will address.
In one of the most comprehensive studies of college grading practices, Stuart Rojstaczer and Christopher Healy documented that the average grade point average at U.S. colleges and universities rose from 2.35 in the 1930s to 2.52 in the 1950s, when a bifurcating trend between public and private institutions emerged. After sharp increases in the 1970s and 1980s, GPAs currently average an astonishing 3.00 and 3.30 at public and private schools, respectively. This trend could be explained by better students achieving at ever-higher levels. But, as discussed above, that is simply not the case when more than 60 percent of college graduates are not proficient in basic reading, writing, and math. Rojstaczer and Healy contend that grade inflation surged in the 1980s with “the emergence of a consumer-based culture in higher education.” And the growth of the for-profit sector has only compounded this problem, since corporate-based education is built upon the faulty premise of delivering a product (an "education" or a "degree") to paying consumers (what we used to call "students").
Professors who resist the pressures of grade inflation find themselves in the position of having to defend their rigorous teaching in a variety of forums, ranging from resolving complaints lodged against them with their department chairs to participating in pseudo-adversarial grade appeals proceedings and formal grievance hearings. Many contemporary college students have an intense sense of consumer-based entitlement in which they see the default grade as an “A.” Recently, I defended a professor who had awarded a “D” to a student who, by my assessment, should have failed the course. During the heated discussion, the complaining student obnoxiously referred to the professor as “incompetent” and “unrealistic.” At one point, she said, “I pay your salaries!” I replied to her, “Then we want a raise for having to deal with snotty, entitled brats like you.”
Notably, the professor involved in this grade dispute was a tenured member of the faculty. For the reasons summarized above, untenured faculty (who comprise more than 70 percent of college instructors nationwide) may have caved in to the student’s demands and changed the student’s grade to avoid a confrontation in which the department chair became involved. But even when faculty members stand their ground, administrators often cave in to student demands because they are concerned with retention rates, time-to-degree completion statistics, complaints from helicopter parents (some of which escalate into lawsuits), and angry students who may turn into alumni who want nothing to do with their alma maters instead of happy alumni who become donors.
The recent case of Professor Dominique Homberger illustrates how college and university administrators contribute to grade inflation. The dean of her college removed Homberger from teaching an introductory biology course at Louisiana State University at Baton Rouge in the middle of the semester after students complained about her harsh grading on the first exam in the course, even though grades on subsequent quizzes and exams were higher (students appear to have gotten the message that they really needed to up their levels of performance).
What do we do about the sad state of affairs in higher education? There are changes we could make at the college level that could go a long way toward improving the quality of higher education. First, no one who has not taken graduate-level courses that prepare them to teach effectively at the college level should be able to earn a Ph.D. and secure a faculty position in an institution of higher education. Graduate education must provide the next generation of college instructors the pedagogical toolkit to be more effective teachers, as well as more effective assessors of student learning. This is especially important with regard to teaching prose, document, and quantitative literacy.
Second, professors who rely exclusively on textbooks must change their ways. Of course, there are many fine textbooks out there, but no college course should rely on a textbook exclusively. Primary source materials from scholarly books and peer-reviewed journals, as well as material from popular culture media (newspapers, magazines, blogs, films, television shows, etc.), when applicable, should be assigned to complement textbook readings. But even more importantly, professors must jettison the “supplements” provided by textbook publishers. Today, many textbooks come with canned lecture notes, study guides, exams, PowerPoint presentations, and other supplementary materials designed to make professors’ lives easier. With few exceptions, most of these materials are targeted at the lowest common denominator.
For example, canned PowerPoint presentations and study guides boil down the information in a textbook chapter to a series of bullet points. But “test bank” questions are the worst offenders. These questions focus exclusively on content and are targeted at low levels of cognitive achievement in Bloom’s taxonomy of learning domains: mere recall of data or information. These assessments do not provide any basis for professors to test students’ ability to analyze, synthesize, or evaluate information in a manner that demonstrates critical thinking, writing, or problem-solving abilities.
Third, we must get serious about confronting grade inflation. College professors are not just teachers; they also should be serving as gatekeepers, as generations of professors did in the past, by awarding grades commensurate with student performance. For this to occur, the consumer-based culture that pervades higher education must be changed. Professors, parents, and administrators must stop coddling students. If a student is not performing satisfactorily, then college instructors must be able to award “D”s or “F”s without worrying about whether doing so will cost them their jobs. Moreover, faculty reward policies (e.g., reappointment, tenure, promotion, merit raises, etc.) must be changed to reward professors who teach and grade with rigor.
Such assessments must focus not just on the content of professors’ courses, but also on how they develop critical thinking, writing, reasoning, and problem-solving skills. Conversely, professors who give away high grades that are not actually earned by students should not be retained. This is not to say, however, that only those professors who award As to 10 percent or fewer of their students are necessarily effective teachers. Rather, we need to develop better ways of assessing a college instructor’s performance than student evaluations and grade distributions. Reappointment, tenure, and promotion decisions should be based on holistic assessments which include qualitative evaluations by several peers who have observed the instructor teach and on teaching portfolios containing exams, writing assignments, grading rubrics, cooperative learning exercises, and the like. Rigor and transparency should be rewarded.
Finally, to effectively combat both grade inflation and a consumer-based culture in the college student–professor dynamic, politicians, accrediting bodies, and senior administrators must stop worrying about graduation rates and time-to-degree-completion. These artificial metrics miss the mark. The obsessive focus on what percentage of students graduate in four or six years only reinforces grade inflation and a consumer-based culture in higher education. If it takes a student eight years to graduate because professors actually hold that student to high levels of achievement before certifying that student as worthy of a degree, so be it! That, at least, would help to restore the value of a college degree rather than perpetuating the disturbing trend of the past few decades in which the value of the baccalaureate degree has deservedly diminished.
Henry F. Fradella is professor and chair of criminal justice at California State University at Long Beach.
I’d like to nominate the term "faculty-driven" as a candidate for disinvestment and elimination.
After serving as a director of composition and as the coordinator of a general education program at universities in the Midwest, I am beginning my second year as department head here at a university in West Texas. At our first all-faculty meeting of the year, it was announced that we are on the verge of two major academic initiatives that will require a substantial commitment of institutional time and energy. The first is a program review process, and the second is a quality enhancement project required by our accreditor.
Both of these efforts are necessary and, I suspect, will result in needed improvements. I have faith in the best practices upon which we will model our efforts. I also believe in the goodwill and good intentions of our academic leadership.
I just wish our administrative team would stop saying these efforts will be "faculty-driven." It’s a term that has little, if any, persuasive power. It may in fact, for many faculty, have the opposite effect. Rather than sweetening the pot, it may just as likely leave a bitter taste.
It's possible academic leaders inside higher ed believe that "faculty-driven" is synonymous with “shared governance” (another term I’d nominate for the trash bin) or "grassroots consensus-building." However, these terms only mask how power actually circulates in academe.
Let me be clear: I’m not opposed to how power functions in colleges and universities. I think the decision-making hierarchy in my university is appropriate and benefits faculty, staff, and students. And it’s quite evident who has authority and who doesn’t. Our operational policies tell the story of power and process quite well, and our organizational chart clearly illustrates the verticality of institutional authority.
I'm only saying that academic leaders should be more careful about how they talk about what we do. Faculty don’t drive processes that come down from accrediting agencies or the administration. They execute them. That’s why a better term is "faculty-executed."
Faculty members are directed to execute program review. Faculty are directed to execute a quality enhancement process. Faculty are directed to develop and assess student learning outcomes. Faculty are directed to appear at convocation and commencement. It may be that faculty don’t like to follow and execute these directives, but that’s really beside the point.
Let me share with you the last two stanzas of a poem I like very much by William Stafford called “A Ritual to Read to Each Other.”
And so I appeal to a voice, to something shadowy,
a remote important region in all who talk:
though we could fool each other, we should consider --
lest the parade of our mutual life get lost in the dark.

For it is important that awake people be awake,
or a breaking line may discourage them back to sleep;
the signals we give -- yes or no, or maybe --
should be clear: the darkness around us is deep.
We may fool each other even when it’s not our intention. So we should take as much care as we can. We should use language that clearly depicts institutional power and its supporting policies and processes. And "faculty-driven" is a smudge.
If faculty drive anything, it’s student learning — the central ritual of any academic enterprise. Certainly, many an institutional directive drives faculty away from that focus. But what drives faculty to distraction even more is language that misses the opportunity to do good work.
My preference is that academic leadership should talk straight about the parade of our mutual life. Leaders should tell faculty what they want faculty to do and, just as importantly, they should tell them why. And tell them often. Not fly-by mission statements and core values on overhead slides. But in public and in person, boots on the ground. Follow me. This way. Here we go.
I would also urge faculty to remind themselves of our larger enterprise more frequently. We can all make the easy case of how overworked we are. But we’re not running backhoes or bouncing along on the back of garbage trucks. We’re lucky enough to work in fields of our own choosing.
From my perspective, program review and enhancement projects offer us the opportunity to make the case for values and valuing, a chance not only to remind ourselves of why we do what we do, but also to remind others — especially those above us in the vertical leap — of our unique and vital contributions to the knowledge pie.
(And we should be beating that drum in our classrooms, too!)
Many faculty think they are undervalued or have no voice in the scheme of things. Why then would we resist a directive to tell our story? Otherwise, the line that threads through all we do may become loose, unravel, or, worse yet, break.
I believe very good and persuasive reasons often exist for why faculty should execute what we are directed to do. But there’s a difference between giving directives and giving directions. The first is a means; the second, an end. But they can be easily confused. And frequently are. When directives become ends in themselves, we lose our way in the dark.
Please take out the directions and read them again.
Laurence Musgrove is a professor and chair of English at Angelo State University, in San Angelo, Texas, where he teaches composition, literature, creative writing, graphic narrative, and visual thinking. His work has previously appeared in Inside Higher Ed, Southern Indiana Review, The Chronicle of Higher Education, Concho River Review and Journal for Expanded Perspectives on Learning. He blogs at www.theillustratedprofessor.com.
I am always working. If not at the office, then at home. And if not in front of a computer, then sitting on the couch with my nose buried in a book or a journal. And if not there, then riding around my yard on the lawnmower, reading the newspaper, or playing golf with friends.
Like most academics, I live the life of the mind, and wherever I go, my mind is there too: sifting through half-baked ideas; ruminating on the latest developments in my field; wondering if my 7-iron will take that tree out of play.
Unfortunately, my wife, Loren, doesn’t buy into this life of the mind thing, at least not completely. Sure, she understands that my job at the local college helps pay the bills, and she also understands that part of the job requires me to come up with ideas and write articles and books. But she has always been suspicious of my definition of work, and whether what I routinely call work should be considered working at all.
Loren has a profoundly materialist view of work. Some might say reductive. For her, work involves actually doing, well, work: something that can be seen and heard. She rejects the proposition that my mind is my office. (Or is it the other way around? My office is my mind? Which one sounds more impressive?) And she thinks the person who came up with that phrase is an idiot.
In graduate school, while writing my dissertation, I tried to convince her that writing should count as work. She agreed that typing on the keyboard, the act of putting words into sentences and paragraphs, counts as work. But she questioned whether the other nonsense I claimed was writing, like surfing the Internet, watching travel shows on TV, sitting in coffee shops, and drinking beer in the afternoon, was actually work. (In my defense, I never once claimed that my principal occupation at the time -- complaining about writing -- was work, even though my buddies, who were writing dissertations as well, assured me that it most definitely was.)
When I got my first job, I did most of my writing at the office. Loren believed that I was working because, well, I was working. I regularly brought home text for her to read and I managed to write a book and nearly two dozen guest columns for newspapers.
But recently, I’ve fallen back into my old habits. Just the other day, about a week before the fall semester began, Loren and I were working at home, she on a do-it-yourself project and me on a writing project. It was slow going that morning, and by about 11:00 a.m. I was ready for a break. I got up from the computer and sat down in the front room to read the newspaper. Loren was coming in and out of the house, taking measurements in the bathroom and cutting drywall in the garage. She passed by a couple of times without comment, but on the fourth trip I heard a low, Marge Simpson-esque grunt.
The sound caught my attention because Loren, like her mother before her, can communicate five or six different meanings with a grunt, depending on the modulation, ranging from mild annoyance to utter dismay. I thought the sound I heard that morning was on the mild end of the spectrum, so I kept on reading the paper.
Twenty minutes later, I went to see Loren’s progress in the bathroom.
“You’re annoying me,” she said before I could say a word, or even poke my head in and take a look around. “When I’m working, you can’t sit and read the paper where I can see you.” (In addition to her materialist view of work, Loren has a strict collaborative view of work as well. If she’s working, I must work. Or at least appear to work.)
I thought about debating her characterization of my morning activity, but quickly realized I could never convince her that reading the paper should count as work. “Okay,” I said, sheepishly returning to my computer.
Reading the paper that morning didn’t officially count as work, at least not in my house, but it did help me get some work done. That short break, and the distraction provided by other people’s ideas, helped me think about my project in a new, productive way.
And that’s the odd, surprising, and wonderful thing about the academic life, about the life of the mind. It often involves staring out the window or doing something else for a while -- putting our projects on hold for a couple hours so we can return to them later in the day, after our subconscious minds have had a chance to do a little work on them.
My marriage to a person who questions the life of the mind is actually quite good for me. It keeps me productive -- gotta keep those fingers tapping on the keyboard lest Loren think I’m looking at the Internet -- and it keeps me honest. I no longer confuse my golfing, reading, or Web surfing with actual work.
Tom Moriarty teaches writing and rhetoric at Salisbury University.