Life

New Digital Tools

Novelty is not, as such, a value to me. One look at my wardrobe will confirm this. But when it comes to assessing new digital tools, being resolutely un-with-it may have certain advantages. I am slow to enthusiasm, and keep well away from the cutting edge, for fear of falling off. All that really counts, to my taste, is usefulness – though simplicity has a definite appeal.

With this week’s column, I want to recommend two such tools. They are free and easy to use. And without indulging in tech boosterism, it seems fair to say that they will improve the time you spend online.

Elegance and efficiency are the defining qualities of Readability. The very name is a case in point – it tells you exactly what you are getting.

With the press of a button, Readability transforms a page from any Web site – however cluttered or eyestrain-inducing – into something clean and legible. It also puts the text in large print. Although I am sufficiently far-sighted to need reading glasses, I don’t need them when using Readability.

But even a person with 20-20 vision in each eye might find Readability appealing for its aesthetic impact. It wipes out all the distractions (sidebars, ads, comments, and most graphic elements) and leaves you with pure, unadorned text.

Someone with no technological aptitude can install Readability in about five seconds. Learning to use it takes not much longer than that. It works in the major browsers: Internet Explorer, Firefox, and Safari. Once installed, it will create either a button in your browser’s toolbar or an “R” icon in the browser’s lower right-hand corner (what people who know the lingo call the “system tray”).

When you find a Web page that you’d care to read as unadorned text, click on the Readability button. It promptly transforms the article (or blog post, or what have you) into a document that resembles a typescript in roughly 14- or 16-point characters. Graphics and photos embedded in the article will remain, but everything else is stripped out.
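
(For readers who like to peek under the hood: the basic trick can be approximated in a few lines of code. The sketch below is mine, not Readability’s actual algorithm; it uses Python with the third-party BeautifulSoup library to illustrate the general idea of throwing away the page furniture, scoring each block by how much paragraph text it holds, and keeping the winner.)

    # A toy, readability-style extractor. An illustration of the idea,
    # not Readability's actual code. Requires: pip install beautifulsoup4
    from bs4 import BeautifulSoup

    def extract_main_text(html):
        soup = BeautifulSoup(html, "html.parser")
        # Drop elements that are almost never part of the article itself.
        for tag in soup(["script", "style", "nav", "aside", "header", "footer"]):
            tag.decompose()
        # Keep the container whose paragraphs hold the most prose.
        best, best_score = soup, 0
        for candidate in soup.find_all(["article", "div", "section"]):
            score = sum(len(p.get_text(strip=True)) for p in candidate.find_all("p"))
            if score > best_score:
                best, best_score = candidate, score
        return "\n\n".join(p.get_text(strip=True) for p in best.find_all("p"))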

To return to the original version of the article, either hit the browser’s “back” button or click the faint back arrow that floats in the upper left corner of the Readability screen. Another such button allows you to print the page as it appears in Readability.

Doing so has its advantages, ecological as well as optical. Printing the graphic elements on a Web page can waste a lot of toner.

It bears mentioning that Readability is an option, not a default setting. In other words, if you are looking at something in it and then go to another page, the new page will not automatically open in Readability. Not a big deal, of course. (You just click it back on.)

Unfortunately, the Readability plug-in does nothing with PDF documents. Also, it will sometimes remove the author’s name from an article -- depending, presumably, on whether or not it is incorporated into the text.

That is a pain. I’m not going to complain too much, though. Readability has already saved me plenty of eyestrain. More than a gizmo, it’s become something I’d hate to be without.

***

A little more time and experimentation are required to master Evernote, but it’s worth the investment. It is an impressive and appealing tool, almost certain to help you get more out of the time you spend doing research online.

As with Readability, I learned of it from my wife, who is a research librarian specializing in information technology. A few months ago, she began proselytizing for it with all the fervor of a Jehovah’s Witness in possession of the latest issue of The Watchtower.

Its virtues and capacities were, so one gathered, both various and mind-boggling, though this inspired in me no undue haste to convert. (I am, remember, a man wearing t-shirts manufactured before many of today’s undergraduates were born.) But having come to recognize the sheer power of Evernote, I am now prepared to spread the good word.

It is something like a hybrid between a notebook and a filing cabinet. That’s the closest analogy that comes to mind. But it understates things quite a bit.

At its most basic level, the application allows you to take notes and organize them into files. You can attach labels to the resulting documents, and search them. But that is really just the tip of the iceberg. Evernote will also allow you to collect and store copies of web pages and articles you’ve found online, as well as PDFs, photographs, scanned documents, and audio files. You can add notes to those multimedia files, too, and attach tags that will help you find them again.

An example: I am gathering ideas and references for a lecture on Bolshevik cultural policy. For the most part, this involves rereading things, but I notice almost by chance that someone is selling a portrait of Lunacharsky, the first Soviet commissar of arts and education, on eBay. A bit too expensive for my budget, alas. But thanks to Evernote I can grab the image and store it in the working file alongside quotations from his work. And I can attach a tag that will remind me to use it as one of the slides for my talk.
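
(For the technically curious, the underlying model can be pictured as a toy sketch in Python. This is a thought experiment of mine, not Evernote’s actual design or API, using the portrait example above.)

    # A toy model of tagged, searchable notes. A thought experiment,
    # not Evernote's actual data model or API.
    from dataclasses import dataclass, field

    @dataclass
    class Note:
        title: str
        body: str
        tags: set = field(default_factory=set)

    class Notebook:
        def __init__(self):
            self.notes = []

        def add(self, note):
            self.notes.append(note)

        def tagged(self, tag):
            return [n for n in self.notes if tag in n.tags]

        def search(self, word):
            return [n for n in self.notes if word.lower() in n.body.lower()]

    nb = Notebook()
    nb.add(Note("Lunacharsky portrait",
                "eBay listing; too pricey, but grab the image for the talk.",
                {"slides", "bolshevik-culture"}))
    print([n.title for n in nb.tagged("slides")])   # ['Lunacharsky portrait']
    print([n.title for n in nb.search("ebay")])     # ['Lunacharsky portrait']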

Evernote allows you to share any given file with other people – by making it available to invited guests or (through a URL) the whole world. And it has at least one feature that is like something out of a spy movie: via its optical character recognition feature, you can take a photograph of text and then use Evernote to search for the words in the photo.

While having dinner at a Chinese restaurant with my technology guru, I sat dumbfounded as she took a snapshot of the menu with her BlackBerry... loaded it into Evernote... searched for the word “dumpling,” which Evernote highlighted in yellow... then forwarded the resulting phototext by email.
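
(You can approximate the same trick with the open-source Tesseract OCR engine. The Python sketch below is my illustration of the idea, not Evernote’s implementation, which, as I understand it, does its recognizing on the company’s servers; the file name is hypothetical.)

    # Search for a word inside a photo using open-source OCR.
    # An illustration of the idea, not Evernote's implementation.
    # Requires: pip install pillow pytesseract (plus the Tesseract engine)
    from PIL import Image
    import pytesseract

    def photo_contains(path, word):
        text = pytesseract.image_to_string(Image.open(path))
        return word.lower() in text.lower()

    # "menu.jpg" is a hypothetical file name, for illustration only.
    if photo_contains("menu.jpg", "dumpling"):
        print("Found it.")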

You can use Evernote with your desktop computer, laptop, netbook, or cell phone. Or all of the above, really, depending on what is most convenient at any given time – for you can have your files stored at Evernote.com. They are in “the cloud,” to use an expression proving we now dwell in a science-fiction landscape.

The free version of Evernote is available for downloading at Evernote.com. There is also a premium version costing $50 per year that I have not used. Among other things, it gives you more room for your files, and allows you to save documents in other formats, including Word. (The free version provides generous but not unlimited storage capacity.)

Evernote has some similarities to Zotero, though it gives you control over a wider variety of materials. On the other hand, Zotero is designed for scholarly use and has the capacity to locate and “grab” bibliographical data from library catalog records, while Evernote does not. (You can store such information using Evernote, of course, but Zotero is more efficient about it and knows how to export the data in various standard citation formats.) Each is a valuable research tool, and with time I will probably figure out a way to move between them.

The Web site for Evernote will give you some idea how to use it, and you can figure a lot out through trial and error. But it might be worthwhile to seek out a little training. Your best bet might be to ask for help at your library, which is staffed by information-science wizards with amazing powers.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

To Her, With Love

Susan Gubar – who is retiring after a remarkable career as a teacher and writer in literature and women's studies – was my teacher. At first glance, the claim might seem thin or self-aggrandizing, the evidence in support of it accurate but scant. I took just one class with Gubar, an undergraduate seminar at Indiana University in the fall of 1980. Three credits out of the 120 or so I earned for my bachelor’s degree. Fifteen weeks out of a student life that lasted nearly a quarter of a century.

So, no, I never took a graduate course with her, never experienced the peculiar intensity and intimacy of a dozen brilliant brats hammering away at big ideas and hoping to earn an approving "smart, very smart" from a demanding professor who delighted in the give-and-take of the seminar table. She did not chair my qualifying exam or direct my doctoral dissertation. She never tore my rough drafts to shreds, exhorting me to read more, think harder, or write more clearly. I never stayed up late grading papers for one of her lecture courses, never faced the terror of speaking in one of those big halls myself in front of one of the most dynamic lecturers in the history of teaching. I never ran to the library to track down a reference for an article she was writing, never house-sat for her, never sat through a mock interview with her in preparation for the job market. I did not teach her to quilt.

I took one class with her, and all I can say is that 30 years later I still give the class and the teacher credit for changing the course of my life. I don’t give Susan all the credit. At 21, I was ready to be inspired and transformed, to find the personal and professional paths I was meant to walk and take my first tentative steps on them, though that cheesy path metaphor makes me sound more like a Victorian heroine than the naïve and unkempt baby dyke I was at the time. In any case, I credit Susan with recognizing what was happening for me and doing everything she could to assure that the moment bore fruit.

What did that mean, in concrete terms? Well, for starters, it meant she didn’t toss me out of her office one autumn afternoon when I burst in without an appointment, pointed at her, and impetuously declared, "I want to do what you do." She sat me down, listened to me, talked to me about what realizing such an ambition would actually involve, and patiently guided me through the steps it would take to get into graduate school. She told me what schools to apply to, carefully read my personal statement, wrote in support of my application, and helped me make a decision when it came time to weigh admissions offers, including a fine one from her own department.

"Go East," she said, because she knew it would be professionally advantageous to have my advanced degrees from an institution other than my undergraduate one. I suspect she also thought it would be good for me to get out of my native state. I took her advice and landed at Rutgers in the fall of 1981, a golden moment when the English department was just beginning to recruit students to come work with the pioneering feminist critics who were there at the time, including Alicia Ostriker, Elaine Showalter, and Catharine Stimpson.

End of story, right? No big deal, eh? It’s the kind of thing we do for our students all the time. Maybe, maybe not.

This is partly a story about luck and good timing, but it is also a story about the structural conditions of public higher education, conditions that have changed significantly since my undergraduate days. I stumbled into Gubar’s class because I needed to pick up a senior seminar after deciding to add English as a second major at the end of my junior year. A friend recommended the course because she’d heard the co-author of a recently published book called The Madwoman in the Attic was a pretty good teacher. The seminar, with the rather dry-sounding title of "Feminist Expository Prose," didn’t necessarily lead one to expect life-altering encounters with radical texts and ideas. I had never even heard of Mary Wollstonecraft, and Three Guineas, the Virginia Woolf text on the syllabus, was the first Woolf I would ever read. I had never heard of Charlotte Perkins Gilman either, but her Women and Economics rocked my young world, while Elizabeth Cady Stanton’s autobiography Eighty Years and More so fascinated me that I hopped in my car over Thanksgiving break to go read the author’s letters in a library 700 miles away.

It was the excitement of that first research trip that propelled me into Susan’s office to announce that I had found my vocation. It’s not immodest to say that Susan took me seriously in part because I so obviously took her and the challenges of her course seriously. She paid attention to me in the office because I was paying attention to her in the classroom. Teaching and learning are all about such moments of recognition and exchange, the meshing of desires, intelligences, imaginations. What do you think about this passage? Lord, I don't know, but did you happen to notice this one?!?

Why write about this formative experience, though, beyond my desire to pay tribute to a great teacher and a valued friend as she steps away from the classroom? I write about it because I am concerned that the conditions of possibility for such encounters are threatened in the current economic climate of higher education. There will always be great teachers, but I fear that great teaching will be much less likely to occur as we reduce the opportunities for the kind of undergraduate learning experience I was so fortunate to have with Susan back in Bloomington all those years ago.

I note with sadness, for example, that the department from which I graduated – like the department in which I now teach – no longer requires a senior seminar of its majors. Such small-group, research-intensive learning is now mandatory only for students enrolled in the honors programs in large humanities departments in cash-strapped public universities. (Did IU's English department have an honors program back in the early 80s? I have no idea, but I probably wouldn't have been in it, since I transferred to the school as a junior and, as previously noted, only declared an English major at the end of that year.)

I have never been one to fetishize requirements, and tend to think we have ridiculously over-structured the lives of today's undergraduates, but the reality is that if I had not had to take a senior seminar I would in all likelihood not have enrolled in Susan Gubar's class in the fall of 1980. And if I hadn't taken that course, I doubt seriously I would have formulated the insane notion of pursuing a Ph.D. in English. Yes, my mother was a high school English teacher when I was young, and I definitely inherited her passions for reading and writing, but I was never encouraged to consider an academic career. My parents thought my facility with languages and the reporter's notebook stuck in my back pocket meant they were making a down payment on my career as a foreign correspondent, though I think my father secretly hoped I would become a Broadway belter.

My point is simply this: Thirty years after my fortunate fall into a class that changed the course of my life, we've made it much harder for kids like me -- middle class, publicly educated, from non-academic families -- to have such experiences. For the upcoming fall semester, my department has exactly one undergraduate seminar on the schedule. It has 20 seats, all reserved for students in the honors program. Ten years ago, the department had six such courses on the fall schedule, each with 18 seats, open to all majors. I understand the brutal economic and institutional conditions that have dictated that shift, but I still can't help worrying about the 88 lost opportunities for students to stumble unwittingly into the delights of concentrated research or to have a close encounter with a faculty member that flicks on a switch they didn't even know they had.

I am sure that if I had only had the opportunity to take one of Susan's large lecture courses I still would have had a thrilling intellectual experience, but it's hard to imagine it would have had the same transformative impact as that magical seminar with the dry-sounding title. It's hard to imagine that, under such circumstances, she would have known me well enough to take seriously my passionate yet inchoate desire to "do what you do." I grabbed the apple and ate hungrily from the tree of knowledge, but the English department made sure I walked into the bounteous, well-tended garden of its roster of seminars.

After attending the symposium held to honor Susan upon her retirement, I walked through the streets of Bloomington for the first time in many years, still trying to absorb the marvelous stories and reflections I had heard the day before of her decades of accomplishment both in and out of the classroom. I felt proud and grateful to be able to say, with so many others, that Susan Gubar was my teacher. She still is, of course, and in all the ways that matter she always will be. I can never repay what I feel I owe her, but, in honor of her and for the sake of the eager 21-year-old kid I will always be in her eyes, I promise I will never stop working to assure that today's and tomorrow's students have access to the same kinds of life-altering learning opportunities I happened upon thirty years ago. My teacher taught me too well for me to dream of anything less.

Thank you, Susan -- for everything.

Author/s: 
Marilee Lindemann
Author's email: 
info@insidehighered.com

Marilee Lindemann is associate professor of English and director of Lesbian, Gay, Bisexual, and Transgender Studies at the University of Maryland at College Park. A version of this essay first appeared on her blog, Roxie’s World.

Why Grading Is Part of My Job

It's May again. The flowers are growing, the birds are singing, and I’m getting ready to comment on my last stack of student papers of the term. When I finish, I’ll assign my students their grades. I’d love to be able to skip that last task and wish them all good luck, so it was with great interest that I read about Professor Cathy Davidson’s bold experiment with having her students grade one another. Let me say first that I'm all for the experimentation and the creative study of learning that Davidson is doing at Duke University, and I’ve long been interested in innovative teaching by Davidson’s former colleague Jane Tompkins (who also tried student self-grading) and research by educators like Alfie Kohn, who argues that competition interferes with the learning process. I admire Davidson’s scholarship, and I’ll look forward to her findings.

But Davidson, Kohn, and others can’t increase the number of spots available at medical schools, and they can’t allot a company more job openings than its revenue allows. Those entities depend on professors for our judgment of students, and until we can come up with a different way to apportion limited resources, we have to work within the system we have.

Grading certainly has its problems, and I’ve never met a teacher who enjoyed it. But just as Winston Churchill described democracy as "the worst form of government" except for all the others, so too with grading.

Let me put it more directly. I think avoiding grading (or some comparable form of rigorous evaluation by the instructor) shirks necessary responsibility, avoids necessary comparison, and puts the humanities at even greater risk of being branded "soft" than they already face.

It doesn’t surprise me that 15 of Davidson’s 16 students signed off on others' work, eventually entitling them to As. Such an outcome brings to mind Garrison Keillor’s description of Lake Wobegon as a community where "all the children are above average."

The bottom line question is this: if everyone gets As, does that mean that Yale Law School will simply accept them all?

If the average grade in a class is an A, then graduate and professional schools will have to look elsewhere to find out how applicants differ. If I were an admissions officer, the first place I’d look would be to other courses with wider grade distributions, where the instructors rank and compare. Those other courses would weigh more heavily, and the professors who teach them would gain disproportionate influence in the decision process. Put simply, Professor Davidson’s colleagues who grade their students would be helping them more than she would.

Perhaps Davidson plans to make distinctions in the recommendations that she’ll write for the students when they apply for professional schools and jobs. But isn't that the grading that she was supposed to be avoiding in the first place, now done in secret? Davidson’s practice also fuels grade inflation, which disproportionately harms a college’s best students by devaluing their high marks. We need to be wary of such trends, and many colleges already are. Harvard recently moved to limit the percentage of its students who graduate with honors, which had swollen to a watery seventy-plus percent. Columbia University includes on a student’s transcript the percentage of students who got As in each class that the student took. Dartmouth and McGill are two universities that also contextualize their students’ grades. These elite institutions want to create a basis for discernment.

That discernment is personal, and it starts in each classroom. We need to be able to say to students in effect, "You did good work, but not the best in the class." It’s a way to be fair to the students and allow them to gain from their achievements.

The goal is not, of course, to make the classroom red in tooth and claw. I work harder at creating learning communities for my undergraduate and graduate students than at anything else I do, and it’s been well worth my effort over the years. I know that I have to keep seeking new ways to do this, because I agree with Davidson, Kohn, and others that students learn better when they can share the enterprise with each other.

There’s plenty of value to Davidson’s collaborative experiment, then — but grading is still part of her job, and mine, and all professors’. If we stop doing it, colleges and universities will eventually lose the esteem of the society that funds us. The humanities, already at risk, will be the chin that absorbs the direct hit.

Parents know that our children respect us when we save our highest praise for the achievements that merit it. I’m a big fan of Cathy Davidson’s work, and I’ve taught it to my own students. But abstaining from giving grades to students isn’t one of her better ideas. I say this with all due respect — and discernment. And that’s the same respect and discernment that we owe to the work of our students.

Author/s: 
Leonard Cassuto
Author's email: 
info@insidehighered.com

Leonard Cassuto is a professor of English at Fordham University, where he was named Graduate Teacher of the Year in 2009.

Digital Students, Industrial-Era Universities

The American university, like the nation’s other major social institutions — government, banks, the media, health care — was created for an industrial society. Buffeted by dramatic changes in demography, the economy, technology, and globalization, all these institutions function less well than they once did. In today’s international information economy, they appear to be broken and must be refitted for a world transformed.

At the university, the clash between old and new is manifest in profound differences between institutions of higher education and the students they enroll. Today’s traditional undergraduates, aged 18 to 25, are digital natives. They grew up in a world of computers, Internet, cell phones, MP3 players, and social networking.

They differ from their colleges on matters as fundamental as how they conceive of and utilize physical plant and time. For the most part, universities operate in fixed locales (campuses) and on fixed calendars (semesters and quarters), with classes typically set for 50 minutes, three times per week. In contrast, digital natives live in an anytime/anyplace world, operating 24 hours a day, seven days a week, unbounded by physical location.

There is also a mismatch between institutions of higher education and digital natives on the goals and dynamics of education. Universities focus on teaching, the process of education, exposing students to instruction for specific periods of time, typically a semester for a course, and four years of instruction for a bachelor’s degree; digital natives are more concerned with the outcomes of education — learning and the mastery of content, achieved in the manner of games, which is why an online game pro will never boast about how long she spent at a certain level, but will talk about the level she has reached.

Higher education and digital natives also favor different methods of instruction. Universities have historically emphasized passive means of instruction — lectures and books — while digital natives tend to be more active learners, preferring interactive, hands-on methods of learning such as case studies, field study and simulations. The institution gives preference to the most traditional medium, print, while the students favor new media — the Internet and its associated applications.

This is mirrored in a split between professors and students, who approach knowledge in very different ways. Traditional faculty might be described as hunters who search for and generate knowledge to answer questions. Digital natives by contrast are gatherers, who wade through a sea of data available to them online to find the answers to their questions. Faculty are rooted in the disciplines and depth of knowledge, while students think in increasingly interdisciplinary or a-disciplinary ways, with a focus on breadth.

Universities and students also now see the learner in polar fashion. Higher education focuses on the individual, captured in 1871 by James Garfield, the future president, who famously described the ideal college as Mark Hopkins, the 19th-century president of Williams College, at one end of a log and a student on the other. Today’s digital natives are oriented more toward group learning, multiple “teachers” or learning resources, and social networking, characterized by collaboration and sharing of content. This approach poses an ethical challenge for universities, which under certain circumstances view collaboration as cheating and content sharing as plagiarism.

These are substantial gaps, complicated by the disparities in the way colleges and digital learners see their roles in education. Higher education is provider-driven in belief and practice. That is, the university, through its faculty, determines the curriculum, the content, the instructional methods, the study materials, and the class schedule. Digital natives tend to be consumer-driven, preferring to choose, if not the curriculum and content they wish to study, then the instructional method by which they learn best, the materials they use to learn, and the schedule by which they choose to study.

So what should be done? First, we need to recognize that this is not the first time colleges and their students have been out of step. In the early 19th century, as the industrial revolution gathered momentum, colleges in the main clung stubbornly to their classical curriculums, rooted in the ancient trivium and quadrivium, and to outmoded methods of instruction. College enrollments actually declined, and numerous institutions closed their doors. Bold colleges like Union, in Schenectady, New York — among the earliest adopters of modern language, science and engineering instruction — boomed in enrollment, topping Yale and Harvard combined.

Today, with college essential in obtaining most well-paying jobs, we will not see higher education enrollments drop. However, tardiness in acting will give impetus to the growth and expansion of alternative higher education — for-profit and nontraditional educational institutions that have been more successful in offering programs better geared to digital learners and their older counterparts.

Second, it is important to ask how much colleges and universities need to change. In 1828, facing industrialization and a Connecticut legislature that disapproved of Yale’s classical curriculum, the Yale faculty responded with a report which asked, in part, whether the college needed to change a lot or a little. This, Yale’s faculty said, was the wrong question. The question to be asked, they argued, was: What is the purpose of a college? This remains the right question today.

What is certain is that higher education needs to change, because students won’t, and the digital revolution is not a passing fad. To be sure, the purposes of the university have not changed. They remain the preservation and advancement of knowledge and the education of our students for humane, productive and satisfying lives in the world in which they will live. The activities of universities will continue to be teaching, research and service.

What must change, however, is the means by which we educate the digital natives who are and will be sitting in our classrooms — employing calendars, locations, pedagogies, and learning materials consistent with ways our students learn most effectively. It means that the curriculum must meet our students where they are, not where we hope they might be or where we are. All education is essentially remedial, teaching students what they do not know. This, for example, is a generation that is stronger in gathering than hunting skills. So let the curriculum begin with breadth and move to depth. Cheating and plagiarism violate the cardinal values of the academy, so let’s make it crystal clear to our students how and why they differ from sharing and collaboration.

It doesn’t make sense anymore to tie education to a common process; a uniform amount of seat time on a fixed clock is outdated. We all learn at different rates. Each of us even learns different subject matters at different rates. As a consequence, higher education must in the years ahead shift its emphasis from teaching to learning, and its focus from common processes to common outcomes. With this shift will come the possibility of offering students a variety of ways to achieve those outcomes, rooted in the ways they learn best, an approach Alverno College in Milwaukee embraced four decades ago.

This needed transformation of the American university is merely the task of taking a healthy institution and maintaining its vitality. In an information economy, there is no more important social institution than the university in its capacity to fuel our economy, our society and our minds. To accomplish these ends, the university must be rooted simultaneously in our past and our present, with its vision directed toward the future.

Traditional Colleges and Digital Students

Colleges                                          Students
Fixed time (semesters, credits, office hours)     24/7 (anytime)
Location-bound                                    Location-free
Provider-driven                                   Consumer-driven
Passive learning                                  Active learning
Abstract                                          Concrete
Traditional media                                 New media
Teaching (one-way instruction)                    Learning (interactive)
Individual (cheating)                             Group (collaboration)
Depth / hunters                                   Breadth / gatherers
Author/s: 
Arthur Levine
Author's email: 
info@insidehighered.com

Arthur Levine is president of the Woodrow Wilson National Fellowship Foundation and president emeritus of Teachers College, Columbia University.

The Empathic Professor

Biological theorist Richard Dawkins writes in The Selfish Gene that if we wish "to build a society in which people cooperate generously and unselfishly towards a common good, [we] can expect little help from biological nature … because we are born selfish." Observers of academic scandal and fraudulent scholarship often attest to that. Conversely, economist Jeremy Rifkin believes "human beings are not inherently evil or intrinsically self-centered and materialistic, but are of a very different nature — an empathic one — and that all of the other drives that we have considered to be primary — aggression, violence, selfish behavior, acquisitiveness — are in fact secondary drives that flow from repression or denial of our most basic instinct."

Who is right, at least when it comes to professors?

Certainly, violence and aggression are facts of life on the typical campus, ranging from assaults, hate speech and shootings to gridiron wars ignited by tribal bonfires, beer kegs and primal weekend rituals.

As director of a journalism school at a science-oriented institution, I can attest that the empathic professor not only exists but daily displays the grace, forgiveness and tolerance usually associated with higher callings. Ours is such a calling. Who but the empathic professor, from overworked adjunct to distinguished don, would profess the same tenets of basic chemistry, composition and calculus semester upon semester, until seasons blend into one career-long academic calendar, were it not for love of learning and the instilling thereof in others?

Teachers, not politicians, shape generations. It has been so since Socrates and Confucius, and ever will be. (Would that state legislatures remember that when allocating funds!)

Too often, it seems, we report the antics, crimes and shenanigans of the Dawkins educator whose selfish gene believes attaining tenure is an entitlement and filing complaints, a fringe benefit.

Within a typical week, I, as director of 50 teachers, teaching assistants and staff members, witness or experience life-changing empathy. I hear it in the open doors of colleagues advising students, or in the break room celebrating birthdays or milestones, or in the hospital visiting a colleague gravely ill but still grading.

Within that same week, of course, I hear gossip, endure factions at faculty meetings, and get anonymous letters and email. Most of my professors realize my English Ph.D. includes a specialty in textual editing, so I can cipher who sent what. (See “Such Stuff as Footnotes are Made On.")

I’m writing about the empathic professor after a week enduring the Dawkins kind, not so much to remind myself that I am surrounded by kinder colleagues as to approach the topic philosophically so that you, too, might focus as I must on the good rather than the disgruntled in our midst. Is it possible that both Dawkins and Rifkin are right, or wrong, or partially so, or more right on one day but wrong the next, especially in the Ivory Tower? I am not a postmodernist promoting truth as illusion. Rather, I am a media ethicist and communication theorist who writes about the human condition, or the inharmonious duet in our heads conveying contrary instructions about the world and our place in it.

Professors, by and large, believe in the human condition but generally do not dwell on it in their disciplines. Ethicists must. In some ways, the human condition sounds eerily like a cable network of talking heads telling us 24/7 that climate change is a political conspiracy; energy consumption, a corporate one; universal health care, a socialist plot; pandemics, a pharmaceutical one, and so on.

Or not.

Although few admit it, on most days we are paradoxical creatures who traipse in our encounters listening to cymbals of consciousness and piccolos of conscience. The former tells us, “We come into the world alone, and we leave it alone” while the latter asserts, “What is in me is in you.”

Which can be right?

Reading Inside Higher Ed, or any educational news site, we discern the chromatic scale of aggression, violence, selfish behavior and acquisitiveness and less often, the empathic tonalities of kindness, forgiveness and compassion. For better or worse, mainstream media and blogosphere reflect the human condition, what Wordsworth called the still, sad music of humanity.

As such, we are both homo avarus and homo empathicus. Avarus, Latin for “greedy,” dwells in the material world; empathicus, in a more metaphysical one. Our life’s work is that of a choral director attempting to harmonize them so that one enlightens the other. When we do, consciousness allows us to see the world as it actually is rather than how we would like it to be; to foresee the impact of our actions before taking them; and to assess consequences of past actions to make informed choices in the future. Only then can we meet the demands of the conscience: that we love and are loved by others; that we have meaningful relationships with others; and that we contribute to community.

In my 2005 book Interpersonal Divide: The Search for Community in a Technological Age, I write that conscience grants us peace when we realize that how we treat others determines our own well-being and fulfillment. "Community," I assert, "is founded on that principle, from secular laws to religious morals."

Jeremy Rifkin writes about “empathic consciousness,” an organizing principle in his new book, The Empathic Civilization: The Race to Global Consciousness in a World in Crisis. However, when he states, "The irony is just as we are beginning to glimpse the prospect of global empathic consciousness, we find ourselves close to our own extinction," he easily could be discussing what I avow: the specter of global conscience.

Appropriately, that prospect is found in Article 1 of the United Nations’ Universal Declaration of Human Rights: “All human beings are born free and equal in dignity and rights. They are endowed with reason and conscience [emphasis added] and should act towards one another in a spirit of brotherhood.”

In media and education, we have listened too long to the cymbals of consciousness drowning the piccolos of conscience. The more educators raise consciousness about any number of public ills, the longer we seem to debate, explicate and irritate each other rhetorically rather than conscientiously, and the closer society comes to catastrophe. Overemphasis of consciousness has resulted in the repression of global conscience, our truer nature.

Conscience acts on simple truths. It does not debate whether climate change is fact or fiction; it intuits that burning so much fossil fuel is harmful to health and hemisphere. Consider the rhetoric of consciousness before the BP oil spill in the Gulf of Mexico — offshore drilling is vital to the economy — and compare that now to the awareness that pings within us daily. Neither does conscience associate universal health care with political systems but bodily ones necessary to enjoy freedom, equality and dignity. It knows pandemics occur regardless of corporate balance books when the balance of nature is disrupted.

As The New York Times reported in 1992, Westerners advocating progress “thought they were nearly invincible, at least as far as infectious disease was concerned. They thought they could interfere with ecosystems, and ship goods and people around the world, with little regard for the effect not only on the balance of nature, but also on their own health.”

That balance of nature is on the agenda again and will be throughout our lifetime and our students’ and their grandchildren's lifetimes. There may not be any lifetimes thereafter unless we as teachers can instruct our charges to harmonize conscience and consciousness so that the duet augurs a new era of ethical awareness of the world and our sustainable place in it.

So I will close by reminding myself as well as others at the end of a trying academic year of slashed budgets, furloughs and firings that the empathic genes of our better natures will prevail. Otherwise the campanile also tolls for us.

Author/s: 
Michael Bugeja
Author's email: 
info@insidehighered.com

Michael Bugeja directs the Greenlee School of Journalism at Iowa State University. His latest book, Vanishing Act: The Erosion of Online Footnotes and Implications for Scholarship in the Digital Age (Litwin Books), is co-authored with Daniela Dimitrova, an Iowa State colleague.

All Summer in a Day

(With apologies to Ray Bradbury. Text in italics is quoted from his short story, "All Summer in a Day")

“Ready?”

“Ready.”

“Now?”

“Soon.”

“Do the scientists really know? Will it happen today, will it?”

“Look, look; see for yourself!”

From my fourth-floor office window, I watched my students spring forth from their underground architectural studio to the plaza above, like meerkats spilling out of their dens. They came in twos and threes, cameras swinging from their necks, balancing their models as they surged out of the door, looking up at the sky expectantly.

The sun came out.

It was the color of flaming bronze and it was very large. And the sky around it was a blazing blue tile color. And the jungle burned with sunlight as the children, released from their spell, rushed out, yelling, into the springtime.

Quickly they tilted their models in the fleeting sun, capturing the shadows that they had not seen for several cloudy, rainy days.

And then—

In the midst of their running one of the girls wailed.

Everyone stopped.

The girl, standing in the open, held out her hand.

“Oh, look, look,” she said, trembling.

They came slowly to look at her opened palm.

In the center of it, cupped and huge, was a single raindrop.

She began to cry, looking at it.

They glanced quietly at the sky.

“Oh. Oh.”

A few cold drops fell on their noses and their cheeks and their mouths. The sun faded behind a stir of mist. A wind blew cold around them. They turned and started to walk back toward the underground house, their hands at their sides, their smiles vanishing away.

Then they came back inside, hopped on their laptops (not up the stairs to my office), and begged for a time extension on their assignment.

“I had to watch my brother play football this weekend.”

“Things don’t always go as we plan.”

“The forecaster said…”

I did not respond.

“But this is the day, the scientists predict, they say, they know, the sun…”

They needed sunlight for their assignment, due the next day. They needed to observe and photograph clear shadows on their architectural models, using a sundial to simulate these shadows at various times of the day and year. They’d had two weeks already, the first week and a half of which had been unremittingly sunny.

I waited a while longer. Finally, when the sun still wasn’t forthcoming, I wrote back with some constructive advice. I told them what I would do in their position, had I painted myself into that particular corner — I would use the light from a slide projector, which is less than ideal, but better than nothing. They didn’t like my suggestion. They parsed words like “partial credit” and brought out the predictable “you didn’t say that in class”. They wanted the sun, the real sun, which would redeem them and make everything all right. And at the 11th hour, it came back out.

… they were running and turning their faces up to the sky and feeling the sun on their cheeks like a warm iron; they were taking off their jackets and letting the sun burn their arms.

“Oh, it’s better than the sun lamps, isn’t it?”

“Much, much better!”

Most of them got to see the sun for just enough time to finish the assignment as intended. But I found out later just how alien the sun still was to them and, sadly, to me, though we live on Earth and not in the near-perpetual rain of Venus, like the children in Bradbury’s story.

One of my students, a girl with clear blue eyes and smooth, straight, light brown hair, came to visit me shortly after the first test. She wanted to check which questions she’d gotten wrong, since she’d done so poorly. She was frustrated that she’d focused too much on the wrong things while studying and at first I was at a loss to help her. Finally we came to a moment of enlightenment. She was surprised that I had asked her to be able to figure out where the sun would be in the sky at various times of the day and year. I had expected that she and her peers had internalized something from recording the sun’s position during their sundial exercise. In short, I had expected her to be like Margot, an earth-born girl who knew the sun by heart.

But Margot remembered.

“It’s like a penny,” she said once, eyes closed.

“No it’s not!” the children cried.

“It’s like a fire,” she said, “in the stove.”

“You’re lying, you don’t remember!” cried the children.

My student admitted that she didn’t really understand this business about the sun. As I flipped through the appendixes of the textbook looking for sun path diagrams to show her, it became clear that I didn’t really, either. I still needed to look it up. As I lay in bed that night, I dreamt up a “sun dance” that I would do in class the next week. It was designed to help the students, and me, remember where the sun is in the summer, winter, spring and fall. Because we all know it, but we all forget. Sitting in that oversized, refrigerated auditorium where my lectures are held, there’s no way we could know what the sun is doing. So in the next class, we stood up and danced:

“It’s the winter solstice. Face south. Stretch out your arms, a little forward. Your left fist is the sun, rising above the horizon to the south of east. Lift it up through the southern sky, in front of you. The angle is low; it will reach into the building. Now raise your right hand to meet it at its highest point, and arc back down to the south of west.”

“OK, now it’s the equinox. Reach your arms straight out to the sides. On the equinoxes, the sun rises directly in the east and sets in the west. It’s now higher in the sky.”

“Now it’s the summer solstice. Stretch your left arm behind you. The sun rises north of east, shines on your back, the north face, at a low angle. As it rises to its apex, it’s even higher in the sky; now you can block it with an overhang. As it sets, the north façade receives this low, western sun.”
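
(For students who want numbers to go with the choreography: a useful rule of thumb is that at solar noon the sun’s altitude is about 90 degrees minus the difference between your latitude and the sun’s declination, and the declination runs from roughly -23.4 degrees at the winter solstice, through zero at the equinoxes, to +23.4 degrees at the summer solstice. A back-of-the-envelope sketch in Python, assuming our Blacksburg campus at about 37 degrees north:)

    # Rule-of-thumb noon altitude: 90 - |latitude - declination|.
    # A back-of-the-envelope sketch, not a full solar-position model.
    def noon_altitude(latitude_deg, declination_deg):
        return 90.0 - abs(latitude_deg - declination_deg)

    LATITUDE = 37.0  # Blacksburg, Virginia, sits near 37 degrees north
    for season, decl in [("winter solstice", -23.4),
                         ("equinox", 0.0),
                         ("summer solstice", 23.4)]:
        print(season, round(noon_altitude(LATITUDE, decl), 1), "degrees")
    # winter ~29.6, equinox ~53.0, summer ~76.4: the low winter sun reaches
    # deep into a building, while the high summer sun is easy to shade.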

… they squinted at the sun until tears ran down their faces, they put their hands up to that yellowness and that amazing blueness and they breathed of the fresh, fresh air and listened and listened to the silence which suspended them in a blessed sea of no sound and no motion. They looked at everything and savored everything.

But I learned, months later, that they didn’t appreciate the dancing. They complained about it to my program chair and on my course assessments, saying it was beneath them, that I was talking down to them.

“She belongs in an elementary school classroom.”

“It is unfair to assume that college classes should involve dancing.”

“No,” said Margot, falling back.

They surged about her, caught her up and bore her, protesting, and then pleading, and then crying, back into a tunnel, a room, a closet, where they slammed and locked the door. They stood looking at the door and saw it tremble from her beating and throwing herself against it. They heard her muffled cries.

Once I learned about the students’ objections, I reacted as quickly as I could. In class, I became more subdued, more opaque. I tried to show more and explain less. I stopped dancing.

Spring came, and with it, more chances for us to get out of our windowless classroom and to see firsthand the work of architects and builders who worked with the sun in a far more direct and convincing way than my abstract explanations could ever convey. I learned the hard way, like Margot, that I can’t really describe the sun. The students have to see it for themselves.

On the last day of classes, they evaluated me again.

“Your opinions are important as we make plans for this course in the future. Please be candid about what topics and experiences you felt were useful, and which ones weren’t,” I heard myself say. What I thought was the same thing all new teachers think, “I am trying to teach you in the best way I know how. Please be kind.”

They stood as if someone had driven them, like so many stakes, into the floor. They looked at each other and then looked away. They glanced out at the world that was raining now and raining and raining steadily. They could not meet each other’s glances. Their faces were solemn and pale. They looked at their hands and feet, their faces down.

“Margot.”

One of the girls said, “Well…?”

No one moved.

“Go on,” whispered the girl.

I left them there, filling out that one last set of bubbles before they were set free. For me, retreating down the corridor, it was a moment of reckoning; for them, a chore barely restraining them from running out into the May sunshine.

They walked slowly down the hall in the sound of cold rain. They turned through the doorway to the room in the sound of the storm and thunder, lightning on their faces, blue and terrible. They walked over to the closet door slowly and stood by it.

Behind the closet door was only silence.

They unlocked the door, even more slowly, and let Margot out.

Author/s: 
Elizabeth Grant
Author's email: 
info@insidehighered.com

Elizabeth Grant is an assistant professor in the College of Architecture and Urban Studies at Virginia Tech.

The iPad for Academics

Teachers and students have always been an important market for Apple — a fact made clear by the tremendous amount of spit and polish that went into the new education website the company recently unveiled. But honestly: What do Apple’s slickly produced promo videos of adorable multicultural elementary schoolers have to do with us? And just how relevant is their newly released iPad for what we do? Do academics really need to shell out five hundred bucks for what is essentially a big iPod touch?

After using an iPad since shortly after its release, I can safely say that the device — or another one like it — deserves to become an important part of the academic’s arsenal of gadgets. Choosing to plop down the money for an iPad is like Ingrid Bergman’s regret over leaving Casablanca with Humphrey Bogart. You will do it: not today, not tomorrow, but soon — and for the rest of your life.

At base the iPad is an anything box that replaces a seemingly endless plethora of other things you already own: It's a TV, a radio, an MP3 player, a compass, a flashlight, a level, a deck of cards, a calculator, a photo album, an alarm clock, a Bible, the Talmud (yes, the Talmud has been ported to the iPad)... the list goes on and on. The crucial question for academics is: What in our current arsenal will the iPad replace? After using the device, the answer surprised me: the iPad makes a lousy computer replacement, but it does a great job of replacing paper.

Let me begin by getting one thing straight: When it comes to weaning professors off of traditional computers, the iPad fails. It is simply not a good device for people who do serious productive work, whether that be reading, writing, or working with multimedia. The iPad’s on-screen keyboard simply cannot hold a candle to an actual keyboard, even for academics who are veteran texters well-versed in the use of autocomplete functions. You could get a keyboard for the iPad… but then you’d be using a netbook.

Apple deserves credit for making the thing as usable as it is, but it is still not quite there. You can browse on it, but you can’t quickly and effectively search databases. You can read e-mail messages, but it takes a tad too long to write them. The screen is much more generously sized than a cell phone… but such a comparison simply damns the iPad with faint praise. Over time the iPad may get more usable as the software improves, but its size will not. And so until the human visual field shrinks and our fingers no longer require tactile feedback, we academics will be sticking to our keyboards and screens.

Where the iPad does shine is as a paper replacement. The iPad is the long, long awaited portable PDF reader that we have hoped for. Finally, we have a device that preserves formatting and displays images, charts, and diagrams. After decades of squinting at minuscule columns of photocopied type we can now zoom in on the articles we are reading and perfectly adjust the text to the width of the screen. You can even highlight and annotate documents and then send the annotations back as notes to your computer.
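
(Getting those annotations back out of a PDF is less mysterious than it sounds. Here is a generic sketch, in Python with the open-source pypdf library, of how the note text stored in a PDF’s annotations can be listed; it is not tied to any particular iPad app, and the file name is hypothetical.)

    # List the comment text stored in a PDF's annotations using the
    # open-source pypdf library. A generic sketch, not any app's API.
    # Requires: pip install pypdf
    from pypdf import PdfReader

    def list_annotations(path):
        notes = []
        for page in PdfReader(path).pages:
            for ref in page.get("/Annots") or []:
                contents = ref.get_object().get("/Contents")
                if contents:
                    notes.append(str(contents))
        return notes

    # "article.pdf" is a hypothetical file name, for illustration only.
    for note in list_annotations("article.pdf"):
        print("-", note)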

True, some people would rather not read on a backlit screen, but it’s great for reading at night, and despite some early evidence to the contrary, LED screens don’t cause eyestrain any more than eInk does. The device is slightly heavier than the Kindle and Nook, but it is still ultra, ultra portable and ultra usable. It makes you read more and saves paper — which is clearly a good thing. Because of the iPad I’m finally untethering myself from paper files. In fact these days I’d rather buy an eBook and export the annotations to my notebook program than add another underlined book to my library — an amazing turnaround for someone who once ranted on this very website about his passion for paper.

The reason the iPad is such a great paper replacement is Apple’s app store. Devices like the Kindle sell you content from a single source and allow you to read it in a single way. The iPad, on the other hand, allows third-party developers to create (and sell) different "apps," or programs, that live on your iPad. This means developers can build better and better apps for reading PDFs, and we can use them without having to buy a new device.

Now, these are early days for the iPad, and the software is still developing: I have to get my PDFs onto my iPad with one program, and open and read them with another. But clearly things will improve. The makers of the überbibliography program Sente are already working on an iPad app, and soon they and others will make the device even more useful. The only thing you’ll need that can’t be downloaded to the iPad to help you read documents is a stylus — that you’ll have to buy yourself, and trust me, it is actually quite useful, even on a “magically” touchable device like an iPad.

That said, the revolutionary thing about the iPad is not software for reading content, but for finding (and buying) it. The iPad represents the genuine retailization of academic content. Let me explain:

Currently folks like Elsevier act as content wholesalers, selling great bucketfuls of the stuff to libraries, which then make it available to students and professors. As journals have slowly transitioned away from paper, they have pursued business models of the "purchase this enormous bundle of journals you don’t want or else our Death Star will destroy another planet of your Rebel Alliance" variety. Individual articles are prohibitively expensive, and academics must fight through a tangled, messy mass of proxy sign-ins and authentication web pages while their IT guys make embarrassing, eye-averting administrative decisions to not think too much about the copyright of what is being posted on class Web sites.

Amazon and others have led the way in producing apps that allow you to read content across different devices: once you purchase an ebook from Amazon, you can read it on a Kindle, an iPad, a Mac, or a PC. This in turn raises the question: What would happen if journals went straight to consumers and sold articles like they were mp3s? What if you could log on to your ScienceDirect or JSTOR app and get a complete browsable list of your favorite journal articles, available for purchase for, say, 25 cents each?

Academics are ready for this development. We’ve spent years suffering from Amazon’s fiendish "get drunk and use our one-click purchase feature" approach to buying books online, and we often download tons of PDFs to make us feel productive. Apps with alerting and micropayment systems could provide for massive distribution, pushing new issues of journals to your digital reading device. As such they offer a world where everyone can read exactly the articles they want. Individuals, not institutions, could purchase content — exactly the content they’d like, regardless of whether their library subscribes to it or not. In such a system publishers might object that piracy would be a concern, but honestly: If you’re selling content to universities that license it to tens of thousands of students living in highly networked dorm rooms, is an app store really going to make the problem worse?

There are plenty of outlandish scenarios to imagine: professors who create specialized current content lists or anthologies of classic or cutting-edge articles, essentially filtering wholesale content and retailing it to increase their academic prestige (or even a chance to dip their beaks). Classrooms where student readers are easy to assemble and cheap — something textbook companies have tried unsuccessfully to do for some time. Librarians free to give up their increasingly restrictive role as purchasing agents and get back to old (and new!) roles of developing collections and enriching their institutions.

A key feature of the retailization of scholarly content is that it be reasonably free of digital rights management -- and here academic publishing should learn from the music industry’s failed attempts to sell copy-protected music. The more open and reusable academic content is, the more reasons people will have to buy it. The great thing about PDFs is that, like MP3s, they are not copy-protected. While some, like the Google book settlement, have sought to meter content down to the word in the name of "choice," such a move will ultimately prove equally stifling. Neither locking down our ability to move texts around nor micrometering them to death are good outcomes for the future of scholarly communication.

As an anything box, the iPad has the potential to replace a whole variety of devices that we use in our research, from voice recorders to GPS units to tuning forks. To be honest, however, I am not sure just how many niches there are here for Apple to fill. The iPad is an expensive device to take to the field, and a lot of the time it is just cheaper and easier to buy a tuning fork. In addition, the App Store lacks the super-deep selection of specialized programs currently available for normal computers.

I'm sure there are certain cases where an iPad might make a great mobile device: photographers who want to view, edit, and upload their photos on the fly, for instance. Overall, however, by splitting the difference between dedicated devices and genuine computers, the iPad doesn’t show a lot of promise as a mobile platform for research and teaching. Of course, if everyone is already carrying around an iPad, then iPads might start replacing voice recorders. It's hard to tell. My bet is that tuning forks and compasses are not going away.

Finally, I’ve been talking about how the iPad helps academics do academe better — but does it offer the ability to do academe differently? Is this device truly "magical" in a way that will radically innovate academe?

While I can imagine some innovative pedagogic uses of the device, what academics do is still narrowly defined — and tied to institutional, political, and economic imperatives. Some imagined the Internet would cause us to rethink what it meant for a text to be coherent — and it has, to a certain extent. But really it has just reinforced our chunky, discrete notions of texts by making it easier to share PDFs and .docs. The academy might be too obdurate to be easily transformable.

At heart, an anything box like the iPad might not be such a dramatic agent for change anyway. The iPad is a chameleon, able to assume the form of other things but lacking (so far) its own unique identity. You can introduce Twitter into the classroom, but Twitter is the innovative factor here, not the iPad. It may be that someone will write the killer app for the iPad that will mutate our activities in unimaginable ways. But for now those ways remain ... unimaginable.

Indeed, it may be that the iPad is just the harbinger of some tablet device yet to come. That future device might not be from Apple, but it will owe a lot to the iPad. Ultimately, academics need a world full of devices they can pour information in and out of. The more open and interoperable our new ecology of applications, devices, and content providers is, the more our learning will enrich human life — whether the people selling us our readers, software, and content are Apple, Amazon, or someone else entirely.


Alex Golub is assistant professor of anthropology at the University of Hawaii at Manoa.

Libel via YouTube

Shakespeare’s Much Ado About Nothing reveals the ways in which malicious and unfounded accusations can destroy lives, friendships, families, and institutions, including academic and military ones. During her wedding ceremony, a bride’s fiancé falsely accused her of prior promiscuity. The fiancé and his lord believed they had seen the evidence of the bride’s infidelity with their own eyes, but the evidence had been cooked by the lord’s bastard brother, who staged a misleading scene to deceive them. Besides destroying the wedding and humiliating the innocent bride, the slander led to dissension within the state and the army. It took a fool who proudly called himself an “ass” to bring the unfounded accusation to the attention of the authorities, the fiancé, and the lord. They exemplified virtue by acknowledging and repenting their overreaction to the false accusation, thus leading to a happy ending believable only in romantic comedy. All’s well that ends well in comedy, so in this case the false accusation was indeed much ado about nothing.

But that is not always the case. Scott Jaschik’s Inside Higher Ed article, “YouTube and Context,” makes clear that I was falsely accused of advocating rape in a lecture I gave on Joseph Conrad and Niccolò Machiavelli at the annual ethics conference at the U.S. Naval War College this past May. The accusation occurred via the Internet on YouTube. A sound and video bite of a little over three minutes from my lecture was posted under the headline, “Naval War College Professor Advocates Rape.” Within a few days, over two thousand viewers saw the clip, which soon attracted the attention of the Pentagon and Congress.

The only problem is that I never advocated rape, which would be crazy in any forum, especially an academic one, and most especially a military one. When an accusation sounds too crazy to believe, look again. Gender-related sensitivities in the American military going at least as far back as the infamous Navy pilots’ Tailhook groping scandal make leaders extremely careful to avoid giving offense to anyone. And indeed, I was not speaking in my own name. Instead, as the full transcript of my remarks reveals, I was explaining why Machiavelli deserves his infamy as a “teacher of evil”: because he did indeed advocate the rape not of women, but of the peoples and countries his ideal leader would subjugate. Hence the title of one of the most insightful books on Machiavelli today, Machiavelli’s Rapacious Republicanism, by an old acquaintance of mine from graduate school, the brilliant Austrian scholar Markus Fischer.

Interpretation is not advocacy. I was interpreting Machiavelli, not advocating Machiavellianism. The person who posted the clip either did not know the difference, in which case he or she was not prepared intellectually for the thoughtful discussions of any academic institution, or did not care, in which case the individual defamed not merely me, but also my institution by deliberately taking my words out of context. As one of my senior colleagues has remarked, the YouTube post was "an act of cyberterrorism not merely against Karl Walling, but the War College itself."

Such libels are bound to become increasingly common in the YouTube age, and they threaten any professor in the classroom. Any one of us could be next. How can we speak freely if we must fear that any student might post distortions of our remarks on the Internet? Can we allow video vigilantes to incite mobs in the university? Can administrators be intimidated by the vigilantes and still retain the trust and respect of faculty? Don't forget that a significant portion of world opinion believes that the lamentable events of 11 September 2001 were the result of a conspiracy by the Bush administration, or Israel, or any number of the usual scapegoats named on libelous Internet websites, not the work of Al Qaeda. This despite the fact that Al Qaeda has claimed credit for the attack! How can we prevent the cyberterrorists from winning?

Because this was the first time my institution had to deal with this rising threat, one that faces every academic institution, it made several rookie mistakes in handling it; but it should be those mistakes, not the individuals who made them or the institution itself, that are the issue now. My institution may be the most intellectually happening place in the American military, but we are all rookies when it comes to Internet libel. We have a common enemy in those who would attack the academy with the Internet equivalent of scribblings on bathroom walls. What can academic institutions do to prevent such mistakes in the future?

Both common sense and common courtesy would dictate informing a professor about a potentially scandalous Internet clip from his or her lecture, seminar, or other professional work, and asking for an explanation before demanding an apology or taking disciplinary action. Especially in light of the Shirley Sherrod incident, in which a conservative blogger defamed a member of the Obama administration by deliberately posting a clip from her remarks that made her seem to say the opposite of what she intended and actually said, prudence would dictate a careful investigation of the facts, including a transcript when it is available, before making hasty judgments.

Much against my own judgment, under heavy pressure, and before I saw the YouTube clip, I did issue a tepid apology, the gist of which was: blame Machiavelli, not me. He, after all, was the one who used rape as a metaphor for leadership. Discerning members of the audience understood this, but this sensationalist farce acquired an unstoppable momentum of its own. That YouTube has since withdrawn the libelous video from its site, following the publication of Scott Jaschik’s article and perhaps also requests from my institution, is no great consolation. The post generated at least a dozen other articles and two television stories. The effects of this false accusation will endure as long as they remain on the Internet unrefuted. Hence, when the facts are finally known, and when they reveal that the accuser distorted a professor’s words to make him or her appear to say exactly the opposite of what the professor intended and actually said, make those facts known widely and publicly. Just do the right thing, as the Obama administration did when it acknowledged Shirley Sherrod had been defamed.

Shirley Sherrod knows the name of her accuser, whom she reportedly intends to sue. My accuser used an anonymous e-mail address. My institution apparently has no conclusive evidence to identify him or her yet, and may never acquire it, so some thought needs to be given to how to deter libel when anonymous e-mail addresses may make posters unaccountable.

As often happens in moments of hysteria, it is sometimes tempting to blame the victim. I used the word “bitch” twice in my remarks: once in depicting the mindset of a rapist; the other time in portraying the victim’s likely attitude toward her rapist. So I was reprimanded for using offensive language, though it is not my words, but Machiavelli’s view of leadership, that is truly offensive. Rape is a common metaphor for conquest and tyranny. As revealed in Chapter 25 of The Prince, in one of the most famous passages in Renaissance literature and philosophy, Machiavelli used the metaphor of the rape of poor Fortuna to reduce politics to war and war to crime. The word hubris, often translated as overweening pride and a common theme not merely of tragedy but also of strategy, stems from a Greek word for rape, with hubristic characters depicted as having lost all sense of limits. Machiavelli challenged the philosophy and religion of his time by questioning whether there can be any ethical limits to strategic thought and action. Unless conferences on professional military ethics are to be mere Sunday school exercises, that question deserves serious attention from those engaged in unconventional wars, in which the customary limits of war frequently come into dispute. What better way to reveal what is most shocking in Machiavelli than to use language that approaches the limit of what is considered acceptable in our time?

It would take the comic genius of Tom Wolfe to explain how my critique of Machiavelli was twisted into the advocacy of the very crime for which I was indicting him. Not merely feminists (who can easily find at least a hundred articles on Machiavelli and feminism with a quick web search), but all decent minds should turn their anger on Machiavelli, not me, while recognizing that he was also a political and military genius, the sort both insurgents and counterinsurgents, terrorists and counterterrorists have much to learn from today. With the United States bogged down in two counterinsurgencies in Iraq and Afghanistan, understanding Machiavelli could prove very useful, if only for learning to think like our worst enemies. How can we learn from evil geniuses without becoming like our own worst enemies? That was one of the big questions of my lecture. That it was obscured by a reckless vigilante is a terrible, terrible pity.

It will take careful thought to save academics from this sort of outrage in the future. It will require a mix of technological, ethical, and institutional fixes. I do not believe it is possible any longer for individuals at my institution to post clips of lectures from its video archives without permission. So there is now a gatekeeper, though perhaps at the regrettable price that recordings of important lectures will be less freely available in the future. Whether gatekeepers are worth this price needs to be examined carefully. It may depend on circumstances.

Since anyone with a cell phone could commit the same offense, technological fixes to institutionally controlled Internet systems will certainly not be enough. The most unsung heroes of colleges and universities are those who teach English composition. Just as they do (or should) teach rules of evidence for written citations, so too ought they to teach students to apply those same rules to video citations, with students warned that plagiarism, deliberate distortions, misleading quotations, and the like are not merely unethical, but may also put them in serious legal jeopardy. My institution does not have a faculty senate, but it surely needed one in this instance to slow down the rush to judgment. Institutions that already have faculty senates might assign Internet libel cases to committees within them, which would serve both the dignity of those institutions and the rights of the accused by providing some form of due process.

And one other thing. Professors teaching Shakespeare might use Much Ado About Nothing to get students to think about why libel is a serious problem, which will help them understand why the thoughtfulness induced by careful reading of old books is relevant to our so-called information age, and perhaps our only salvation from the snap judgments that age frequently induces. Such thoughtfulness is the aim of my teaching, which, with a little drama now and then, has helped me turn on more than a few light bulbs. It would be a crime to let the cyberterrorists turn out the lights of the academy.


Karl Walling is a professor of strategy at the United States Naval War College.

Missed Opportunities

William Buckley famously said he’d “rather be governed by the first 2,000 names in the Boston phonebook than by the dons of Harvard.” In my 14 years as president of a leading liberal arts college, I grew weary of overworked jokes that likened leading a faculty to herding cats or kangaroos. Looking back, I recognize in them a bit of bravado masking an awkward misalignment. Faculty are proudly autonomous, defiantly so, independent thinkers who give each other as much trouble as they give the administration when one or another of them raises a head above the herd in a gesture of leadership. Faculty are socialized as individuals, not as members of a group; taking a broader view runs against the grain for many of them, in the ways and for the reasons Hugh Heclo enumerates in his insightful book, On Thinking Institutionally. And yet the principle of “shared governance” requires a faculty capable of effective self-governance in partnership with professional administrators and a voluntary governing board.

The institution I was privileged to lead and others with which I’ve been affiliated have wonderful faculty – exceptionally engaged, responsible, and responsive in virtually every respect. Yet from the day I arrived on campus as a new president, I was schooled in a cultural norm that held that the better part of valor was to tiptoe around the faculty. It was as though "the faculty" as a whole were a hibernating bear no one dared disturb for fear of being mauled. I could see all the ways in which the faculty as a body – a "constituency" in academic parlance – was being watched, coddled, and handled with enormous investments of energy and studied restraint. Over time, as I became adept at reading the emotional force fields on campus, I realized that this strenuous effort was thinly masking an undercurrent of fear. And this, I have come to learn, is true to one degree or another through much of the academy.

The fear arises out of an intellectual culture that is awash in competition and critique, in picking ideas apart and taking no prisoners. Critical thinking and skepticism are the coins of the realm. But skepticism can devolve to cynicism, and criticism to contempt, an acrid brew of belligerence and disengagement that can poison morale and yield a system of self-governance far better suited to obstruction than construction. This is a pity because it matters, both educationally and strategically.

Educationally, students pay close attention to how the "grown-ups" on campus behave. The academy remains arguably one of the last major sectors in American society still making a good-faith effort to both uphold and enact the view that in a healthy democracy we have obligations to one another. This includes the obligation to resolve differences by enabling the majority to form its collective judgment through meaningful discourse in which all relevant positions are fully aired. "A democracy needs citizens who can think for themselves rather than simply deferring to authority, who can reason together about their choices rather than just trading claims and counterclaims," Martha Nussbaum wrote in Cultivating Humanity.

Strategically, faculty governance bodies have pressing work to do in this era of shrinking resources and accelerating global competition. If they once routinely fostered authentic and serious public debate about real educational problems, discussion now too often deteriorates into something even less informative than a clash of competing claims, a spectacle more akin to disconnected “serial oratory.” At my own institution, and at others I knew well, it was mystifying to see faculty members we revered for their pedagogic virtuosity – faculty who were creating in their private classrooms exquisitely hospitable venues for courageous exploration of controversial ideas – so stuck in old and unsatisfying habits when trying to resolve conflicts in the academic calendar, come to terms with grade inflation, or revise the curriculum.

These discussions moved painfully slowly and unpredictably. Often a lone, loud voice or a mobilized minority faction would hijack the conversation at the eleventh hour. I couldn’t help but wonder, at these times, whether this would be happening if the faculty as a whole were more vividly experiencing itself weighing evidence and making wise choices on matters of curricular or educational consequence and then feeling bound to one another by their collective decisions.

Many faculty are increasingly conscious of imbalances within their own ranks, frustrations they discuss privately with deans or presidents hoping for a simple solution from on high. Rarely do they come together to explore their mutual accountabilities: to one another, to their departments, to their disciplines, and to students other than those they see directly in their own classes, offices, studios or labs. Some carry a disproportionate load for their institution as a whole, while others seem to ride more or less free. Disparities of this kind seem to be widening.

When one or another faculty member would bring an injustice or a dispute to the administration for adjudication, I often felt tempted to weigh in with what looked like decisiveness. I learned, though, that only the faculty had the power to resolve differences among themselves. The impulse that flows from perceived inequities is to tighten central controls. But that only exacerbates the problem. People who feel under surveillance resist authority, or withdraw, or both, feeding a vicious cycle: more controls, less commitment. Rather than acquiesce in the imposition of more central controls, faculty themselves would do well to shore up their own systems of citizenship, taking account of the increasing complexity of faculty work, while recognizing that the institution’s continued success will require ever greater interdependence.

In some schools, the economic downturn has brought faculty into new relationships with the administration and the trustees on budgetary decision-making, strengthening their roles in shared governance, at least for a time; in others, the reverse has occurred. As financial and competitive pressures continue to bear down on all institutions of higher learning, the incremental changes many have been making to ride out the recession – draining reserve accounts, deferring maintenance, making across-the-board budget reductions, reducing staff, relying more on contingent faculty – are likely to shift more work onto faculty shoulders and erode the quality of their work lives. If budgets have to be trimmed further, it’s hard to imagine finding additional economies without reconsidering the organization of the educational enterprise itself and the assumptions behind it: how students learn, how faculty teach, the nature of the curriculum, how everyone uses time.

I worry that the professoriate may be standing at the threshold of a shake-down as disruptive as was the restructuring of medical work that began in the 1970s when health care costs began to spiral out of control, the process that Paul Starr analyzed with such foresight in The Social Transformation of American Medicine. And I worry that colleges and universities with strong faculty – brilliant scholars, devoted teachers, radical individualists, and stubborn skeptics who treasure autonomy, resist authority, distrust power, and who love their institutions as they have known them – may find it especially difficult to bring faculty together, bring departments together and make timely, wise, informed and realistic choices about a future worth having.

Over the next decade, colleges and universities are likely to need greater flexibility, organizational resilience and openness to new ideas, and, at the same time, stronger internal systems of shared responsibility, accountability, collaboration and communication. They will need to become more fluid learning organizations, better positioned to capitalize on the forces of change, and better able to make and defend potentially divisive choices, while remaining true to the purposes that will ensure continued success.

Faculty will need to be clearer about those purposes and about the essential ingredients of the education they want their students to expect and receive – an integrative education that prepares new generations to take their places in a world of mounting complexity, interdependency, inequality ... urgency. They will need to do a better job of modeling the serious engagement of their own differences that integrative learning clearly implies and that enlightened organizational stewardship absolutely necessitates.


Diana Chapman Walsh served as president of Wellesley College from 1993 to 2007.

Fixing Higher Ed

The press and the blogosphere have devoted significant coverage recently to a report by the Georgetown University Center on Education and the Workforce that predicted that the United States is on a "collision course with the future." The report estimated that within a mere eight years, the nation will suffer a shortfall of at least 3 million workers with college degrees and 4.7 million workers with postsecondary certificates. The authors of the report concluded that to meet the challenges of a global economy in which 59 to 63 percent of domestic jobs require education beyond the high-school level, America’s colleges and universities "need to increase the number of degrees they confer by 10 percent annually, a tall order."

Although numerous commentators have responded to the report by echoing its call for increased access to higher education, it seems to me that few have focused on a key term in the report’s call to "develop reforms that result in both cost-efficient and high quality postsecondary education." Producing millions more baccalaureate-educated workers will do nothing to address the competitiveness of the U.S. workforce if those degrees are not high quality ones. Sadly, it is pretty clear that far too many college degrees aren’t worth the paper on which they are printed.

In 2006, the Spellings Commission reported disturbing data that more than 60 percent of college graduates were not proficient in prose, document, and quantitative literacy. In other words, significantly more than half of college degree holders in the United States lack the “critical thinking, writing and problem-solving skills needed in today’s workplaces.”

Robert Atkinson, president of the Information Technology and Innovation Foundation, cited these findings in his recent Huffington Post essay, "The Failure of American Higher Education." He shared stories about recent college graduates, many from prestigious universities, who had applied for jobs at his think tank who were unable to complete basic tasks such as summarizing a person’s credentials into a short biographical sketch or calculating an average using a spreadsheet. Atkinson argues that one of the primary reasons for the inability of so many college graduates to think, write, speak, argue, research, or compute proficiently is that colleges “are focused on teaching kids content, not on teaching them skills.” His explanation for this is that members of the professoriate are not interested in teaching these important skills, but rather are interested in exploring the content of the subject matter in which they specialize. Atkinson then advocates several "solutions" to his perception of the problem, which include a requirement that all college graduates take a national test to measure skills competencies and “radical experimentation” in college design that focuses “on teaching 21st century skills, not 20th century subjects.” These ideas are typical of the well-intentioned but misinformed suggestions that abound these days about higher education.

The commentators are correct that there is a mismatch between what faculty members are doing and could be doing to teach students. But the problem isn't a lack of faculty interest in students, but a broader set of staggering challenges facing professors – challenges that deserve more attention.

First, college and university faculty members often lack the ability to teach basic reading, writing, and math skills. Why? Because most professors are not trained to do so. With few exceptions, doctoral programs focus on teaching disciplinary content and methods of inquiry, not pedagogy. Even in universities that provide their doctoral students with a "preparing future faculty" program to help Ph.D. candidates develop some teaching skills, such programs focus on teaching and learning at the college level, not on basic reading comprehension, the fundamentals of composition, or elementary quantitative skills. The K-12 educational system is supposed to teach these abilities. By the time students get to college, faculty members rightfully expect that they will already know how to calculate an average or summarize the main points of a newspaper article, a book chapter, or a journal article. Accordingly, faculty members see their role as then honing students’ critical thinking abilities within the context of analyzing, synthesizing, and evaluating information, often within a disciplinary framework.

These assumptions were fair ones once upon a time. Sadly, though, far too many students who have earned a high school diploma are unable to meet such expectations. Apart from a handful of specialists in English departments, most college faculty members are simply ill-equipped to teach students how to begin writing coherently. Professors expect to give students feedback on writing more efficiently and persuasively, not to teach tenses, subject-verb agreement, or basic punctuation. Yet these are the types of problems with which faculty routinely try to cope, at least for a while. And that leads to my second point.

Given the woefully inadequate preparedness of high school graduates to engage in college-level work, many professors quickly become burned out attempting to teach skills that they never expected they would need to teach at the postsecondary level. I have heard dozens of colleagues from across the country at different types of institutions of higher education say, "I didn’t earn a Ph.D. to teach what should have been taught in elementary and high school." Many such instructors give up; rather than teaching the skills that should have been learned before students arrive in college, they focus on content because it’s easier to do so. There is only so much that can be done over the course of a college quarter or semester. Worse yet, they fear holding students to high standards for a myriad of reasons, which is the third problem I wish to discuss.

College faculty members, especially those who are untenured, often fear setting course expectations too high, challenging students’ comfort levels too much, or being rigorous in their assessments of student performance. If students perceive a professor as being too hard, they will avoid that person's classes, which can lead to under-subscribed classes being canceled. Full-time faculty whose courses are canceled may be reassigned to less desirable duties; part-time faculty members whose classes are canceled often find themselves without any courses to teach. In addition, students often "punish" faculty members they perceive as being too demanding by evaluating them poorly at the end of a course. Because low student evaluations can lead to both tenure-track and adjunct faculty being fired, untenured professors may keep workloads at levels that students perceive to be reasonable and assess their performance more generously than may be actually deserved. Much has been written on this phenomenon as one of the leading factors contributing to the nationwide problem of grade inflation, the fourth issue I will address.

In one of the most comprehensive studies of college grading practices, Stuart Rojstaczer and Christopher Healy documented that the average grade point average at U.S. colleges and universities rose from 2.35 in the 1930s to 2.52 in the 1950s, when a bifurcating trend between public and private institutions emerged. After sharp increases in the 1970s and 1980s, GPAs currently average an astonishing 3.00 and 3.30 at public and private schools, respectively. This trend could be explained by better students achieving at ever-higher levels. But, as discussed above, that is simply not the case when more than 60 percent of college graduates are not proficient in basic reading, writing, and math. Rojstaczer and Healy contend that grade inflation surged in the 1980s with “the emergence of a consumer-based culture in higher education.” And the growth of the for-profit sector has only compounded this problem, since corporate-based education is built upon the faulty premise of delivering a product (an "education" or a "degree") to paying consumers (what we used to call "students").

Professors who resist the pressures of grade inflation find themselves in the position of having to defend their rigorous teaching in a variety of forums, ranging from resolving complaints lodged against them with their department chairs to participating in pseudo-adversarial grade appeal proceedings and formal grievance hearings. Contemporary college students hold an intense sense of consumer-based entitlement in which they see the default grade as an “A.” Recently, I defended a professor who had awarded a “D” to a student who, by my assessment, should have failed the course. During the heated discussion, the complaining student obnoxiously referred to the professor as “incompetent” and “unrealistic.” At one point, she said, “I pay your salaries!” I replied to her, “Then we want a raise for having to deal with snotty, entitled brats like you.”

Notably, the professor involved in this grade dispute was a tenured member of the faculty. For the reasons summarized above, untenured faculty (who comprise more than 70 percent of college instructors nationwide) might well have caved in to the student’s demands and changed the grade to avoid a confrontation involving the department chair. But even when faculty members stand their ground, administrators often cave in to student demands because they are concerned with retention rates, time-to-degree completion statistics, complaints from helicopter parents (some of which escalate into lawsuits), and angry students who may turn into alumni who want nothing to do with their alma maters instead of happy alumni who become donors.

The recent case of Professor Dominique Homberger illustrates how college and university administrators contribute to grade inflation. The dean of her college recently removed Homberger from teaching an introductory biology course at Louisiana State University at Baton Rouge in the middle of the semester after students complained about her harsh grading on the first exam in the course, even though grades on subsequent quizzes and exams were higher (students appear to have gotten the message that they really needed to up their levels of performance).

What do we do about the sad state of affairs in higher education? There are changes we could make at the college level that would go a long way toward improving the quality of higher education. First, no one should be able to earn a Ph.D. and secure a faculty position at an institution of higher education without having taken graduate-level courses that prepare them to teach effectively at the college level. Graduate education must provide the next generation of college instructors with the pedagogical toolkit to be more effective teachers, as well as more effective assessors of student learning. This is especially important with regard to teaching prose, information, and quantitative literacy.

Second, professors who rely exclusively on textbooks must change their ways. Of course, there are many fine textbooks out there, but no college course should rely on a textbook exclusively. Primary source materials from scholarly books and peer-reviewed journals, as well as material from popular culture media (newspapers, magazines, blogs, films, television shows, etc.), when applicable, should be assigned to complement textbook readings. But even more importantly, professors must jettison the “supplements” provided by textbook publishers. Today, many textbooks come with canned lecture notes, study guides, exams, PowerPoint presentations, and other supplementary materials designed to make professors’ lives easier. With few exceptions, most of these materials are targeted at the lowest common denominator.

For example, canned PowerPoint presentations and study guides boil down the information in a textbook chapter to a series of bullet points. But “test bank” questions are the worst offenders. These questions focus exclusively on content and are targeted at low levels of cognitive achievement in Bloom’s taxonomy of learning domains: mere recall of data or information. These assessments do not provide any basis for professors to test students’ ability to analyze, synthesize, or evaluate information in a manner that demonstrates critical thinking, writing, or problem-solving abilities.

Third, we must get serious about confronting grade inflation. College professors are not just teachers; they should also serve as gatekeepers, as generations of professors did in the past, by awarding grades commensurate with student performance. For this to occur, the consumer-based culture that pervades higher education must be changed. Professors, parents, and administrators must stop coddling students. If a student is not performing satisfactorily, then college instructors must be able to award “D”s or “F”s without worrying about whether doing so will cost them their jobs. Moreover, faculty reward policies (e.g., reappointment, tenure, promotion, and merit raises) must be changed to reward professors who teach and grade with rigor.

Such assessments must focus not just on the content of professors’ courses, but also on how they develop critical thinking, writing, reasoning, and problem-solving skills. Conversely, professors who give away high grades that students have not actually earned should not be retained. This is not to say, however, that only those professors who award As to 10 percent or fewer of their students are necessarily effective teachers. Rather, we need to develop better ways of assessing a college instructor’s performance than student evaluations and grade distributions. Reappointment, tenure, and promotion decisions should be based on holistic assessments that include qualitative evaluations by several peers who have observed the instructor teach and on teaching portfolios containing exams, writing assignments, grading rubrics, cooperative learning exercises, and the like. Rigor and transparency should be rewarded.

Finally, to effectively combat both grade inflation and a consumer-based culture in the college student–professor dynamic, politicians, accrediting bodies, and senior administrators must stop worrying about graduation rates and time-to-degree completion. These artificial metrics miss the mark. The obsessive focus on what percentage of students graduate in four or six years only reinforces grade inflation and a consumer-based culture in higher education. If it takes a student eight years to graduate because professors actually hold that student to high levels of achievement before certifying that student as worthy of a degree, so be it! That, at least, would help to restore the value of a college degree rather than perpetuating the disturbing trend of the past few decades, in which the value of the baccalaureate degree has deservedly diminished.


Henry F. Fradella is professor and chair of criminal justice at California State University at Long Beach.
