Survival of the Disciplines

First they came for the religious studies scholars and the geologists, and I posted comments on a couple of blogs. Then they came for the film studies people and the comparative littérateurs, and I briefly considered joining a Facebook group in protest. Then they came for the paleographers and computational linguists, and I signed a petition. Hold on, let me see who's banging on the door at this hour...

As long as I've been paddling around in academia -- i.e., since my father got his master's degree -- tenure has been the flagpole from which academic freedom has flown. It was all about protecting individuals from the pressures that the status quo puts on forward-thinking research.

Underlying that approach is the assumption that all areas of study are important, although individual arguments and conclusions may not be. But the recent developments at the University of Florida, the University of Iowa, King's College London, Washington State University, USC, and a host of other institutions reflect a new model of limiting academic inquiry, one that sidesteps the protections of tenure altogether.

The script seems to be the same everywhere. Go after the whole discipline, making sure to pay unctuous lip service to its importance and excellence. Make the point that ITTET (In These Tough Economic Times), colleges now have to be selective about what fields they can (read: deign to) support. Throw gobbets of meat to the angry students. Dodge the faculty as much as possible, and when you can't, turn them against each other by insisting that some program will have to go, and who would they load into the tumbrels instead? Use the word “painful” in every sentence.

I have no issue per se with specialization; most institutions can't have a program in every possible discipline. But over and over again, we're seeing an emphasis on STEM fields -- science, technology, engineering, and mathematics. The more you read about the STEM initiative, the scarier it gets. "STEM is the indicator of a healthy society"; "STEM is the key to future success"; STEM is the only thing that will keep us from living in refrigerator boxes under the freeway and eating our young. And just look at the signatories: Those are the institutions that have committed to prioritizing the sciences over the humanities and the social sciences. Goodbye, liberal arts -- it's been fun, but now it's time to get serious.

You will already have noticed that S, T, E, and M are not just the fields that bring in the money but also the fields that prefer to assign as small a role to interpretation as possible. Of course scientific data require human interpretation, but all the STEM-mers I know believe that their fields deal in right and wrong answers. My colleague in the math department informs me that her discipline involves no interpretation whatsoever. And this is just as it should be; the natural world can refute hypotheses with tremendous clarity (see under: phlogiston; blood-letting; group selection).

But right and wrong answers occupy only one side of the academic quad. And this axing of whole fields closely resembles an attack on the humanities and social sciences -- in other words, the interpretive studies. It's not a concerted attack (complex conspiracies almost never succeed), but the effect is the same: promoting black-and-white disciplines and demoting unresolvable ambiguity to the realm of the hobbyist.

The effect on literary studies seems pretty obvious to me. Criticism will disappear quickly, and we'll return to the era of Appreciation. (Can you tell that I've just been teaching my theory students about 19th-century lit crit, to show them what the formalists were reacting to?) That's not a bad thing, except that aesthetic appreciation is generally (I'm inclined to say "necessarily," but I'm not sure I can defend that claim) a very effective means of shushing minority/subaltern groups and reinforcing the dominant ideology. The dominant ideology sets up opaque standards of appreciation and then measures everything by them -- and anything representing a different ideology (and standard of appreciation) is dismissed. That's exactly what happened to computational linguistics at King's College London.

I'm not sure where this leaves us, aside from up the creek. Perhaps subaltern studies is the last barricade against this broad-scale attack on whole classes of disciplines. After all, the subaltern is mad as hell and not going to take it any more. So too might be the medievalists, the linguists, and the rural sociologists, but we don't know jack about organizing and making our voices heard.

The English Department could be the one to turn out the lights when we go. They keep us around because they value something they call "clear writing," and they think that whatever our silly little research is about, at least we teach writing (so they don't have to). Little do they know that we also teach the careful manipulation of metaphor -- better known as propaganda and marketing. But we're obviously not practicing what we teach, or else the interpretive disciplines would be in better shape.

The same can be said for Political Science, to choose just one example in the social sciences. A physicist friend points out with some bitterness that STEM has already come up with a set of solutions (her word, not mine) for global warming. Implementing them is the problem, she notes, and that is a job for the humanities and social sciences. If we gut those areas, every problem is left half-solved.

The thought of a world without Criticism -- a culture where any problem requiring interpretation is either ignored or recast as one with a single right answer -- isn't pretty. All those claims made for STEM fields (healthy society, future success, blah blah blah) are every bit as true for the interpretive studies. I agree that we could all do with more knowledge of S, T, E, & M (I pressure all my advisees to take statistics, for a start), but a society that sees every question in terms of black and white isn't going far. At least not in an upward direction.

Meg Worley

Meg Worley is an assistant professor of English at Pomona College.

Put Out to Pasture

I want to believe that when I was taking my favorite professors’ classes, those great men and women were at their peak.

I was a little disconcerted, then, when an older friend recently told me about how good my hero and mentor, the critic Marvin Mudrick, had been 20 years before I had taken him. “But … but I was there at the end,” I whined to myself. For eight years (until he died in 1986), as an undergraduate and graduate student at the University of California at Santa Barbara, I took his classes or sat in on them.

Even so, I knew there were quarters and classes during that time when he was better than at others. But I wanted him to have been at his best during those years, and I guess he fed that conceit himself. He would make fun of some of his own old views about books, writers, and teaching — so I believed I was taking him at his peak. He seemed to think he was at his peak.

This past week, with my 11-year-old daughter sitting in on a few of my classes during her school break, I was perhaps at my worst. She was looking at me with expectation, attentively — an encouraging, demanding student. I watched my language and I hoped the students would watch theirs, not that she hasn't heard everything. And then the next day she was supposed to come again to my classes, but she stayed back to hang out with another professor's daughter at a campus closer to home, and I was free, and I was in the classroom, happy to be free, aware, by this point in the semester, of how far I could push the students and hoping to keep a couple of them engaged. I was funnier than I had been in a long while — telling tangential stories that then led into better conversations than we would otherwise have had.

“We tell you everything — what about you?” teased one student.

And that day I was not old Bob — that is, paternal, avuncular Bob — I was young Bob, the one I’ve been missing, and I was willing to tell them things about my life that I wouldn’t have told them if my daughter had been in the room.

I was younger without my daughter than with her — I was free again, and teaching like that, by my wits rather than by my deliberate, this-is-for-your-own-good friendliness and deadpan, I was better for a day.

I was better.

I’ve told a few young teachers and a few young adjunct professors this: I was happier teaching as an adjunct than as a full-time professor. It’s something like the difference I felt as a student writing for one professor over another, or occasionally now, writing for one publication instead of another. In one I’m loose, myself, giddy, and in the other I’m responsible and sober. I’m better unsober. I don’t drink alcohol, but I’m better and smarter when I’m funny, when my funniness loosens up the class and makes my developmental students, so very self-conscious, so very cautious, lean out a little for a look, go out on a limb.

Don’t things change for us as teachers? Don’t we have to deal with that damned aging in a way that our friends in non-teaching professions don’t? We get older, but the students stay the same age. Mr. Mudrick used to tell us, his students, that he talked about love and sex a lot in the classroom because it was the only thing we all shared an interest in. I’m not so daring or funny as he was, so I don’t go very far that way, except … sometimes.

Every semester I can still get the recent immigrants and 18-year-olds hot under the collar about William Carlos Williams’s “The Knife of the Times,” a story he wrote in the early 1930s about a potential affair between two women, girlhood friends who are now middle-aged and married with families. Some of my students unashamedly express prejudices about homosexuality, and outrage as well about such affairs, and yet not one in five of the parental marriages in the room is intact.

Fictional characters are somehow supposed to behave! Better than real people! Most of my non-literary students hate conflicted people, people struggling to make romantic decisions that will cripple them. Political decisions, social decisions, those are too easy, in my opinion. But dare to tell someone you love that you love her? In my experience that’s the biggest drama. Call me a Jane Austenite. But also call me old.

Aging athletes, like aging professors, also like to say that they’re better now than they used to be. But people who really pay attention know that sometimes the young superstar is best when young; that he doesn’t just get better and better as some artists do; he hits his physical peak, and, lacking steroids — are there steroids for artists or professors? — he deteriorates and becomes a coach.

As I’ve proceeded as a teacher of developmental English I’ve become, to my thinking, more like a coach, an encourager, a butt-slapper (but because we’re not on a field, I do so only metaphorically). But I was better, I think, as a young professor, someone in-between, as someone questioning what we as a classroom of friendly strangers are doing, as the guide who occasionally stops and wonders out loud, “Where are we going and why?”

No, I’m older and so aware of time passing that I get as anxious as a sheepdog and herd them along.

Mr. Mudrick may have stayed younger by making himself more and more aware of the constraints on him as provost of a large college program and professor. That is, he deliberately wouldn’t let himself hold back. Because he was my unorthodox model, perhaps it’s inevitable that I slide toward more conventionality rather than further away from it. And perhaps this has come to mind because last week I started listening again to old tape recordings of his classes.

He was so much himself in those classes, so happy to be there, so interested in us, in our reactions to what we read, in our reactions to what he provokingly and amusingly said, that those hundreds of hours in his classes continue to make me happy. But having had those stirring experiences, on teaching days when my students and I have slogged through something that my college or department or my own old-age practicality has decided is necessary, I despair!

And then I have a good day, and I remind myself of my old self, and I know I’ve lost something.

But like an athlete on the long decline, I stick around because I really still do like the game. I grimace when I miss a good teaching moment! Like a batter missing a fat pitch, I wince, “Oh, I should’ve nailed that!” A while ago, back in the day, I would’ve! So I’m slower, more watchful and deliberate, and because I can’t afford to miss as much as I used to, I’ve become more likely to take advantage of the little things that come up and go my way.

Bob Blaisdell

Bob Blaisdell is a professor of English at City University of New York’s Kingsborough Community College.

Going Nodal

A year ago in October, on a Saturday morning when the sun would not show its face, a group of about 30 faculty members sat around tables in a classroom that looks out on a restored prairie. The view from this window was already interdisciplinary; this piece of land not only serves as a site for scientific research, but is also presided over by the austere profile of a limestone cairn designed by British artist Andy Goldsworthy.

Helped by a grant from the Howard Hughes Medical Institute, we came together to talk about nodes. It’s not often that language in a grant proposal captures the imagination of a campus, but this has happened with the idea of nodes. Several of our faculty members in the sciences — led by a chemist, Mark Levandoski — came up with the idea. A node is a term used in more than one field: words like boundary, equilibrium, scale, transfer, model, energy, preservation. Learning what such a term means in other contexts might enhance one’s ability to understand and explain the concept in one’s own discipline. Hence the quest to identify such concepts — or nodes — in our undergraduate curriculum, and to discover how we can teach them more effectively.

The aim is not to develop a list of must-have concepts in the sciences. Some years ago our curriculum shifted to a focus on investigative skills and processes that largely replaced coverage of specified content. Instead of making a list, we want to discover where these intersections are occurring, and capitalize on them to help students learn. In the first phase, we took advantage of time freed by the grant to enroll in each other’s classes, the better to learn what students are hearing from our colleagues. Ultimately, the plan is to draw attention to the nodes and be clear with students about complementary perspectives across disciplines. As a result of examining nodes, interdisciplinarity — the relationships between disciplines and how each constructs knowledge — would become part of what we teach, even at the introductory level.

At the Saturday retreat on the prairie, some of the initial goals were already shifting. For one thing, there was no way we could limit this idea to the sciences. At least one economist, a philosopher, and a librarian had been invited, and some of the liveliest discussion arose at their tables. At a college where every faculty member teaches the required first-year tutorial, in a campus climate that invites exploration of new technologies and proposals for team-taught seminars, we share the territory.

A biologist declared, startling the economist and physicists, “To us, equilibrium is death!” Another biologist became restless as the librarian at her table extolled the node of preservation. She thought of dusty books, and wondered what she, a molecular geneticist, could do with this node. Suddenly it came to her. Fundamental to her work is the paradox that the material of biological inheritance must resist change in order to preserve hereditary information, while also being open to change in response to new environmental and evolutionary challenges. “I can’t help but think,” she reported after the session, “that a longer, deeper discussion with a group of non-biologists about preservation would freshen the way that I think about this idea and the way that I teach it. It turns out that this concept comes up in every biology course I teach.”

The models node has already provided a basis for early efforts at coordination between our intermediate-level biology and chemistry courses. After taking a summer workshop supported by the HHMI grant, chemistry professor Steven Sieck and biology professor Shannon Hinsa-Leasure developed a plan to present students with the models of penicillin used in their two fields. On the first day of class, Steve led his students through an outline of this molecule’s synthesis, which includes about 20 different chemical reactions. Toward the end of the semester, Steve presented the same synthesis again, highlighting the fact that most of these reactions had by then been covered in the course. Meanwhile, students co-enrolled in Shannon’s Biology 251 studied the mechanism of action for this same molecule — how the drug inhibits the ability of bacteria to synthesize cell walls. And in both classes, students were encouraged to go and see penicillin represented in works of art featured in "Molecules That Matter" on exhibit at the college’s Faulconer Gallery.

As a dean trained in literature and writing, I recognize that nodes have been around for a long time, and that another word for them is metaphors. An influential book by George Lakoff and Mark Johnson, Metaphors We Live By (1980), asserts that all conceptual thinking relies on metaphor.

In the spring, invited to lunch with a visiting group of statisticians, I performed a small test. I asked them what they thought about the word ambiguity. They recoiled. Ambiguity is bad. It confounds data and must be expunged from survey questions. What about in my field? I let them in on the fact that literary critics find ambiguity fascinating. How else could we examine the same novel or poem for centuries, without agreeing on — or even wanting — a final, definitive account of its meaning? They began talking among themselves again, about ambiguity. Maybe it was a richer concept in their field, too, than they had realized. I sat back, relieved. I had wondered what I could talk about for a whole lunch meeting, alone in a room of statisticians, a dean from the English department who had never taken a statistics class. But there would be more than enough to fill the hour. We had just begun to explore a node.

Paula V. Smith

Paula V. Smith is dean of the college and vice president for academic affairs at Grinnell College.

Pride in One's Work

Throughout my 31 years in higher education, from assistant to full professor at three universities — Oklahoma State, Ohio and Iowa State — I cannot recall doing anything that produced a lingering feeling of pride.

I’m not talking about ego-related pride in a promotion or an award; you outgrow those as years pass. No. I’m talking about an act so challenging that you doubted that you could perform it but undertook it anyway as a test of character or acumen.

As an ethicist, I know that pride is a deadly sin — the deadliest, in fact, and "sin of sins" of the seven — responsible for the fall of Lucifer from heaven (and many an assistant professor from the Ivory Tower).

The pride of which I speak has certain characteristics. It is done for internal rather than external reasons, often as a barometer of validity, and requires:

  • A test of one’s talents, knowledge, research or skill beyond what is routinely achievable.
  • The witnessing of that test by others so that the specter of public failure exists.
  • Courage to go through with the test in spite of feelings of dread or potential embarrassment.

To be honest, I never have taken much pride in my work as a teacher, researcher and administrator. That’s not a boast; it’s a treadmill fact. I dislike networking socially with former students because current ones need my time and attention. By the time my research is published, I’m doing other experiments that may end up refuting former hypotheses.

I'm sharing my sense of pride today not to celebrate myself but to remind you that renewal is essential with the academy in recession. The institution will take from you without acknowledgment or reward. Over time, that may cause you to question or doubt your validity and worth.

In fairness, though, educators who test students semester after semester may neglect to test themselves on the very principles and practices they embrace in the classroom or conference room. Theory is one thing; applying it in real life, another.

I purposely did not mention my specific challenge up front because I didn’t want you to dismiss my experience as peculiar to journalism. You can conjure challenges in any discipline, the range of which will vary person to person and pedagogy to pedagogy. Neither should you do anything risky that can potentially harm you or your career and then litigate because Inside Higher Ed incited you with this article. You’re an adult. Do what you will within reason and accept responsibility and consequences — that’s part of the challenge, anyway.

Here was mine: My colleague Dennis Chamberlin, a Pulitzer Prize-winning journalist, and I left our posts as journalism educators and worked for a week as photographer and writer for The Des Moines Register, to see whether Watergate-era reporters could succeed in the digital newsroom after a decades-long hiatus.

Before we began, we got official sponsors for our blog, including a social network, the Washington Post Writers Group, and the Association for Education in Journalism and Mass Communication. Our plan was to post daily for one week before, during, and after our Register gig.

Because we grade students, we asked that the managing editor, Randy Brubaker, grade us. Did I mention that our 750 undergraduate students and 35 teachers and staff and untold alumni and donors were following our blog via shared link and RSS feed on our school’s home page?

You might wonder what prompted two educators, secure in their careers, to take that risk before such an audience, knowing the political impact could be huge, especially on the Internet. For instance, I have written widely and skeptically about consumer technology, yet I would be tweeting and blogging throughout my Register experience. Moreover, in addition to our constituents at Iowa State, other blogs would be following us as we tested the unpopular hypothesis that education and industry put too high a value on new technology and too low a premium on principles.

Chamberlin and I worked in media during a highly technological period — the switch from typewriters to computers — and so understood how complicated computing was in the DOS Age of the 1970s and 80s. Today’s technology, we felt, does everything for and about you, announcing upcoming appointments, providing driving instructions for interviews, and taking dictation or photos on demand and on site.

You can read about our experiment at “My Register Experience,” which begins with comparisons of technology then and now. By the end of our journey, we were among the first in the media to report on “The New Poverty,” about members of the middle class who had paid their bills and taxes and suddenly found themselves at homeless shelters or in need of food and medical care, in a state known for both. We also wrote and shot in narrative style — with beginning, middle and end (rare in today’s reportage) — and interviewed on the street rather than in the suite, finding the unemployed at a public lake rather than at the unemployment office, on the notion that Iowans, with their strong work ethic, needed something to do in the morning.

We also used intuition more than Global Positioning System software to track the depth of the recession. Doing so, we broke a big story along with one journalism principle. Bob Woodward and Carl Bernstein were known for using anonymous sources. We did, too, knowing that the practice had since become taboo (for purists, anyway) because it undermined credibility.

At that lake, we encountered an unemployed nurse who at first identified herself and then later asked for anonymity. Her personal plight was dramatic, and her reason for not being named very legitimate — it might hurt in her job search. So we honored her request.

Shortly after our story appeared in the Nov. 23, 2009, edition of the Register, Buffy Renee Lucas, an unemployed nurse with similar demographics, drove her SUV into the lake at the same shore where Chamberlin and I had conducted interviews, killing herself. It was widely assumed that this was the very nurse we had interviewed for our story, but it wasn’t. We used the blog to clarify that after publication, grateful for the Internet’s capacity for instant publication.

We tweeted whenever we had an update to our story, and that integrated well with our blog, driving audience to the blog and, ultimately, from the blog to the print product on its run date.

In the end we received a passing grade from the managing editor.

Much good came out of our report, with food banks replenished and even television network follow-up, culminating in free psychiatric workshops on the untreated effects of persistent unemployment.

I returned to my journalism school with new appreciation for the work that modern-day reporters do in the digital newsroom, producing content on demand. Our methods also inspired younger journalists who wanted to practice street reporting, and we learned from them the value of digital devices in meeting deadlines with every login.

Months later, something deeper than pride occurred within me: validation. As a reporter and bureau manager, I had witnessed firsthand the trauma and sorrow of spot news working for United Press International. I covered serial killings, prison riots, natural disasters and uprisings on Native American reservations. Because of that, I left the newsroom for the classroom. Returning to the newsroom involved courage more than the specter of public failure; it required inner strength to silence the demons of hard news past.

As such, I remain indebted to the Register for trusting Chamberlin and me to work a week without preparation and to file a human-interest story that resulted in some good and disclosed some bad, including escalating suicide rates that correlated in part with the recession.

I look back at recent awards, promotions and even published scholarship with little sense of pride. I am paid to do that. But not this, which was a statement — or maybe a punctuation point — in my career, knowing that my principles still had value and that I, as an educator, was genuine in conveying them to students and sharing them with colleagues, as I am doing now.

In closing, I encourage you to share in the comments section below or even in a submission to Inside Higher Ed how you may have tested your own talents, knowledge, research or skill beyond what you knew you could achieve, requiring courage in the wake of dread or embarrassment.

We need to hear courageous stories that inspire others in this lingering recession, as we face budget cuts and larger workloads or even furloughs, firings and program elimination. If ever there was a need for uplifting stories, it is now, reminding us why we dedicated our lives to higher education and taking pride in our work, whether or not others appreciate or even acknowledge it.

Michael Bugeja

Michael Bugeja, director of the Greenlee School of Journalism and Communication at Iowa State University, is author of Living Ethics Across Media Platforms and Interpersonal Divide: The Search for Community in a Technological Age.

New Digital Tools

Novelty is not, as such, a value to me. One look at my wardrobe will confirm this. But when it comes to assessing new digital tools, being resolutely un-with-it may have certain advantages. I am slow to enthusiasm, and keep well away from the cutting edge, for fear of falling off. All that really counts, to my taste, is usefulness – though simplicity has a definite appeal.

With this week’s column, I want to recommend two such tools. They are free and easy to use. And without indulging in tech boosterism, it seems fair to say that they will improve the time you spend online.

Elegance and efficiency are the defining qualities of Readability. The very name is a case in point – it tells you exactly what you are getting.

With the press of a button, Readability transforms a page from any Web site – however cluttered or eyestrain-inducing – into something clean and legible. It also puts the text in large print. Although I am sufficiently far-sighted to need reading glasses, I don’t need them when using Readability.

But even a person with 20-20 vision in each eye might find Readability appealing for its aesthetic impact. It wipes out all the distractions (sidebars, ads, comments, and most graphic elements) and leaves you with pure, unadorned text.

Someone with no technological aptitude can install Readability in about five seconds. The learning curve for its use takes not much longer than that. It works in the major browsers: Internet Explorer, Firefox, and Safari. Once installed, it will create either a button in your browser’s toolbar or an “R” icon in the browser’s lower right-hand corner (what people with the lingo call its “system tray”).

When you find a Web page that you’d care to read as unadorned text, click on the Readability button. It promptly transforms the article (or blog post, or what have you) into a document that resembles a typescript in roughly 14- or 16-point type. Graphics and photos embedded in the article will remain, but everything else is stripped out.

To return to the original version of the article, either hit the browser’s “back” button or click the faint back arrow that floats in the upper left corner of the Readability screen. Another such button allows you to print the page as it appears in Readability.

Doing so has its advantages, ecological as well as optical. Printing the graphic elements on a Web page can waste a lot of toner.

It bears mentioning that Readability is an option and not a default setting. In other words, if you are looking at something in it, then go to another page, the new page will not automatically open in Readability. Not a big deal, of course. (You just click it back on.)

Unfortunately the Readability plug-in does nothing with a document in PDF. Also, it will sometimes remove the name of the author from an article -- depending, presumably, on whether it is incorporated into the text or not.

That is a pain. I’m not going to complain too much, though. Readability has already saved me plenty of eyestrain. More than a gizmo, it’s become something I’d hate to be without.
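The core idea is easy to illustrate. Here is a minimal sketch in Python (my own illustration, not Readability's actual algorithm, which scores blocks of text heuristically): walk the page's markup, skip the elements that typically hold clutter, and keep the text that remains.

```python
from html.parser import HTMLParser

# A minimal sketch of the idea -- not Readability's real algorithm.
# We skip the elements that usually hold clutter (scripts, styles,
# navigation, sidebars) and collect the visible text that remains.
class TextExtractor(HTMLParser):
    SKIP = {"script", "style", "nav", "aside", "footer"}

    def __init__(self):
        super().__init__()
        self.depth = 0        # > 0 while inside a skipped element
        self.chunks = []      # pieces of visible text

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth == 0 and data.strip():
            self.chunks.append(data.strip())

def readable_text(html):
    """Return the 'pure, unadorned text' of an HTML page."""
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)
```

The real tool goes further, of course, re-rendering the result in a large, legible typeface, but this stripping step is the heart of what happens when you press the button.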


A little more time and experimentation are required to master Evernote, but it’s worth the time. It is an impressive and appealing tool, almost certain to help you get the most out of time spent doing research online.

As with Readability, I learned of it from my wife, who is a research librarian specializing in information technology. A few months ago, she began proselytizing for it with all the fervor of a Jehovah’s Witness in possession of the latest issue of The Watchtower.

Its virtues and capacities were, so one gathered, both various and mind-boggling, though this inspired in me no undue haste to convert. (I am, remember, a man wearing t-shirts manufactured before many of today’s undergraduates were born.) But having come to recognize the sheer power of Evernote, I am now prepared to spread the good word.

It is something like a hybrid between a notebook and a filing cabinet. That’s the closest analogy that comes to mind. But it understates things quite a bit.

At its most basic level, the application allows you to take notes and organize them into files. You can attach labels to the resulting documents, and search them. But that is really just the tip of the iceberg. Evernote will also allow you to collect and store copies of web pages and articles you’ve found online, as well as PDFs, photographs, scanned documents, and audio files. You are able to add notes to those multimedia files, too, and to attach tags that will help you find them again.
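To make the notebook-plus-filing-cabinet analogy concrete, here is a toy sketch (my own illustration, not Evernote's actual data model): each note carries a body and a set of tags, and can be retrieved either by tag or by full-text search.

```python
from dataclasses import dataclass, field

# A toy model of the notebook-plus-filing-cabinet idea -- these are
# illustrative structures, not Evernote's actual ones.
@dataclass
class Note:
    title: str
    body: str
    tags: set = field(default_factory=set)

class Notebook:
    def __init__(self):
        self.notes = []

    def add(self, title, body, *tags):
        self.notes.append(Note(title, body, set(tags)))

    def by_tag(self, tag):
        # All notes carrying a given tag.
        return [n for n in self.notes if tag in n.tags]

    def search(self, word):
        # Case-insensitive full-text search over titles and bodies.
        word = word.lower()
        return [n for n in self.notes
                if word in n.title.lower() or word in n.body.lower()]
```

Filing a found image with a reminder tag, as in the eBay example that follows, then becomes a one-liner.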

An example: I am gathering ideas and references for a lecture on Bolshevik cultural policy. For the most part, this involves rereading things, but I notice almost by chance that someone is selling a portrait of Lunacharsky, the first Soviet commissar of arts and education, on eBay. A bit too expensive for my budget, alas. But thanks to Evernote I can grab the image and store it in the working file alongside quotations from his work. And I can attach a tag that will remind me to use it as one of the slides for my talk.

Evernote allows you to share any given file with other people – by making it available to invited guests or (through a URL) the whole world. And it has at least one feature that is like something out of a spy movie: via its optical character recognition feature, you can take a photograph of text and then use Evernote to search for the words in the photo.

While having dinner at a Chinese restaurant with my technology guru, I sat dumbfounded as she took a snapshot of the menu with her BlackBerry... loaded it into Evernote... searched for the word “dumpling,” which Evernote highlighted in yellow... then forwarded the resulting phototext by email.

You can use Evernote with your desktop computer, laptop, netbook, or cell phone. Or all of the above, really, depending on what is most convenient at any given time -- for your files are stored remotely, in “the cloud,” to use an expression proving we now dwell in a science-fiction landscape.

The free version of Evernote is available for downloading here. There is also a premium version costing $50 per year that I have not used. Among other things, it gives you more room for your files, and allows you to save documents in other formats, including Word. (The free version provides generous but not unlimited storage capacity.)

Evernote has some similarities to Zotero, though it gives you control over a wider variety of materials. On the other hand, Zotero is designed for scholarly use and has the capacity to locate and “grab” bibliographical data from library catalog records, while Evernote does not. (You can store such information using Evernote, of course, but Zotero is more efficient about it and knows how to export the data in various standard citation formats.) Each is a valuable research tool, and with time I will probably figure out a way to move between them.

The Web site for Evernote will give you some idea how to use it, and you can figure out a lot through trial and error. But it might be worthwhile to seek out a little training. Your best bet might be to ask for help at your library, which is staffed by information-science wizards with amazing powers.

Scott McLemee

To Her, With Love

Susan Gubar – who is retiring after a remarkable career as a teacher and writer in literature and women's studies -- was my teacher. At first glance, the claim might seem thin or self-aggrandizing, the evidence in support of it accurate but scant. I took just one class with Gubar, an undergraduate seminar at Indiana University in the fall of 1980. Three credits out of the 120 or so I earned for my bachelor’s degree. Fifteen weeks out of a student life that lasted nearly a quarter of a century.

So, no, I never took a graduate course with her, never experienced the peculiar intensity and intimacy of a dozen brilliant brats hammering away at big ideas and hoping to earn an approving "smart, very smart" from a demanding professor who delighted in the give-and-take of the seminar table. She did not chair my qualifying exam or direct my doctoral dissertation. She never tore my rough drafts to shreds, exhorting me to read more, think harder, or write more clearly. I never stayed up late grading papers for one of her lecture courses, never faced the terror of speaking in one of those big halls myself in front of one of the most dynamic lecturers in the history of teaching. I never ran to the library to track down a reference for an article she was writing, never house-sat for her, never sat through a mock interview with her in preparation for the job market. I did not teach her to quilt.

I took one class with her, and all I can say is that 30 years later I still give the class and the teacher credit for changing the course of my life. I don’t give Susan all the credit. At 21, I was ready to be inspired and transformed, to find the personal and professional paths I was meant to walk and take my first tentative steps on them, though that cheesy path metaphor makes me sound more like a Victorian heroine than the naïve and unkempt baby dyke I was at the time. In any case, I credit Susan with recognizing what was happening for me and doing everything she could to assure that the moment bore fruit.

What did that mean, in concrete terms? Well, for starters, it meant she didn’t toss me out of her office one autumn afternoon when I burst in without an appointment, pointed at her, and impetuously declared, "I want to do what you do." She sat me down, listened to me, talked to me about what realizing such an ambition would actually involve, and patiently guided me through the steps it would take to get into graduate school. She told me what schools to apply to, carefully read my personal statement, wrote in support of my application, and helped me make a decision when it came time to weigh admissions offers, including a fine one from her own department.

"Go East," she said, because she knew it would be professionally advantageous to have my advanced degrees from an institution other than my undergraduate one. I suspect she also thought it would be good for me to get out of my native state. I took her advice and landed at Rutgers in the fall of 1981, a golden moment when the English department was just beginning to recruit students to come work with the pioneering feminist critics who were there at the time, including Alicia Ostriker, Elaine Showalter, and Catharine Stimpson.

End of story, right? No big deal, eh? It’s the kind of thing we do for our students all the time. Maybe, maybe not.

This is partly a story about luck and good timing, but it is also a story about the structural conditions of public higher education, conditions that have changed significantly since my undergraduate days. I stumbled into Gubar’s class because I needed to pick up a senior seminar after deciding to add English as a second major at the end of my junior year. A friend recommended the course because she’d heard the co-author of a recently published book called The Madwoman in the Attic was a pretty good teacher. The seminar, with the rather dry-sounding title of "Feminist Expository Prose," didn’t necessarily lead one to expect life-altering encounters with radical texts and ideas. I had never even heard of Mary Wollstonecraft, and Three Guineas, the Virginia Woolf text on the syllabus, was the first Woolf I would ever read. I had never heard of Charlotte Perkins Gilman either, but her Women and Economics rocked my young world, while Elizabeth Cady Stanton’s autobiography Eighty Years and More so fascinated me that I hopped in my car over Thanksgiving break to go read the author’s letters in a library 700 miles away.

It was the excitement of that first research trip that propelled me into Susan’s office to announce that I had found my vocation. It’s not immodest to say that Susan took me seriously in part because I so obviously took her and the challenges of her course seriously. She paid attention to me in the office because I was paying attention to her in the classroom. Teaching and learning are all about such moments of recognition and exchange, the meshing of desires, intelligences, imaginations. What do you think about this passage? Lord, I don't know, but did you happen to notice this one?!?

Why write about this formative experience, though, beyond my desire to pay tribute to a great teacher and a valued friend as she steps away from the classroom? I write about it because I am concerned that the conditions of possibility for such encounters are threatened in the current economic climate of higher education. There will always be great teachers, but I fear that great teaching will be much less likely to occur as we reduce the opportunities for the kind of undergraduate learning experience I was so fortunate to have with Susan back in Bloomington all those years ago.

I note with sadness, for example, that the department from which I graduated – like the department in which I now teach – no longer requires a senior seminar of its majors. Such small-group, research-intensive learning is now mandatory only for students enrolled in the honors programs in large humanities departments in cash-strapped public universities. (Did IU's English department have an honors program back in the early 80s? I have no idea, but I probably wouldn't have been in it, since I transferred to the school as a junior and, as previously noted, only declared an English major at the end of that year.)

I have never been one to fetishize requirements, and tend to think we have ridiculously over-structured the lives of today's undergraduates, but the reality is that if I had not had to take a senior seminar I would in all likelihood not have enrolled in Susan Gubar's class in the fall of 1980. And if I hadn't taken that course, I doubt seriously I would have formulated the insane notion of pursuing a Ph.D. in English. Yes, my mother was a high school English teacher when I was young, and I definitely inherited her passions for reading and writing, but I was never encouraged to consider an academic career. My parents thought my facility with languages and the reporter's notebook stuck in my back pocket meant they were making a down payment on my career as a foreign correspondent, though I think my father secretly hoped I would become a Broadway belter.

My point is simply this: Thirty years after my fortunate fall into a class that changed the course of my life, we've made it much harder for kids like me -- middle class, publicly educated, from non-academic families -- to have such experiences. For the upcoming fall semester, my department has exactly one undergraduate seminar on the schedule. It has 20 seats, all reserved for students in the honors program. Ten years ago, the department had six such courses on the fall schedule, each with 18 seats, open to all majors. I understand the brutal economic and institutional conditions that have dictated that shift, but I still can't help worrying about the 88 lost opportunities for students to stumble unwittingly into the delights of concentrated research or to have a close encounter with a faculty member that flicks on a switch they didn't even know they had.

I am sure that if I had only had the opportunity to take one of Susan's large lecture courses I still would have had a thrilling intellectual experience, but it's hard to imagine it would have had the same transformative impact as that magical seminar with the dry-sounding title. It's hard to imagine that, under such circumstances, she would have known me well enough to take seriously my passionate yet inchoate desire to "do what you do." I grabbed the apple and ate hungrily from the tree of knowledge, but the English department made sure I walked into the bounteous, well-tended garden of its roster of seminars.

After attending the symposium held to honor Susan upon her retirement, I walked through the streets of Bloomington for the first time in many years, still trying to absorb the marvelous stories and reflections I had heard the day before of her decades of accomplishment both in and out of the classroom. I felt proud and grateful to be able to say, with so many others, that Susan Gubar was my teacher. She still is, of course, and in all the ways that matter she always will be. I can never repay what I feel I owe her, but, in honor of her and for the sake of the eager 21-year-old kid I will always be in her eyes, I promise I will never stop working to assure that today's and tomorrow's students have access to the same kinds of life-altering learning opportunities I happened upon thirty years ago. My teacher taught me too well for me to dream of anything less.

Thank you, Susan -- for everything.

Marilee Lindemann

Marilee Lindemann is associate professor of English and director of Lesbian, Gay, Bisexual, and Transgender Studies at the University of Maryland at College Park. A version of this essay first appeared on her blog, Roxie’s World.

Why Grading Is Part of My Job

It's May again. The flowers are growing, the birds are singing, and I’m getting ready to comment on my last stack of student papers of the term. When I finish, I’ll assign my students their grades. I’d love to be able to skip that last task and wish them all good luck, so it was with great interest that I read about Professor Cathy Davidson’s bold experiment with having her students grade one another. Let me say first that I'm all for the experimentation and the creative study of learning that Davidson is doing at Duke University, and I’ve long been interested in innovative teaching by Davidson’s former colleague Jane Tompkins (who also tried student self-grading) and research by educators like Alfie Kohn, who argues that competition interferes with the learning process. I admire Davidson’s scholarship, and I’ll look forward to her findings.

But Davidson, Kohn, and others can’t increase the number of spots available at medical schools, and they can’t allot a company more job openings than its revenue allows. Those entities depend on professors for our judgment of students, and until we can come up with a different way to apportion limited resources, we have to work within the system we have.

Grading certainly has its problems, and I’ve never met a teacher who enjoyed it. But just as Winston Churchill described democracy as "the worst form of government" except for all the others, so too with grading.

Let me put it more directly. I think avoiding grading (or some comparable form of rigorous evaluation by the instructor) shirks necessary responsibility, avoids necessary comparison, and puts the humanities at even greater risk of being branded "soft" than they already face.

It doesn’t surprise me that 15 of Davidson’s 16 students signed off on others' work, eventually entitling them to As. Such an outcome brings to mind Garrison Keillor’s description of Lake Wobegon as a community where "all the children are above average."

The bottom-line question is this: if everyone gets As, does that mean that Yale Law School will simply accept them all?

If an average class grade is an A, then graduate and professional schools will have to look elsewhere to find out how applicants differ. If I were an admissions officer, the first place I’d look would be to other courses with wider grade distributions, where the instructors rank and compare. Those other courses would weigh more heavily, and the professors who teach them would gain disproportionate influence in the decision process. Put simply, Professor Davidson’s colleagues who grade their students would be helping them more than she would.

Perhaps Davidson plans to make distinctions in the recommendations that she’ll write for the students when they apply for professional schools and jobs. But isn't that the grading that she was supposed to be avoiding in the first place, now done in secret? Davidson’s practice also fuels grade inflation, which disproportionately harms a college’s best students by devaluing their high marks. We need to be wary of such trends, and many colleges already are. Harvard recently moved to limit the percentage of its students who graduate with honors, which had swollen to a watery seventy-plus percent. Columbia University includes on a student’s transcript the percentage of students who got As in each class that the student took. Dartmouth and McGill are two universities that also contextualize their students’ grades. These elite institutions want to create a basis for discernment.

That discernment is personal, and it starts in each classroom. We need to be able to say to students in effect, "You did good work, but not the best in the class." It’s a way to be fair to the students and allow them to gain from their achievements.

The goal is not, of course, to make the classroom red in tooth and claw. I work harder at creating learning communities for my undergraduate and graduate students than at anything else I do, and it’s been well worth my effort over the years. I know that I have to keep seeking new ways to do this, because I agree with Davidson, Kohn, and others that students learn better when they can share the enterprise with each other.

There’s plenty of value to Davidson’s collaborative experiment, then — but grading is still part of her job, and mine, and all professors’. If we stop doing it, colleges and universities will eventually lose the esteem of the society that funds us. The humanities, already at risk, will be the chin that absorbs the direct hit.

Parents know that our children respect us when we save our highest praise for the achievements that merit it. I’m a big fan of Cathy Davidson’s work, and I’ve taught it to my own students. But abstaining from giving grades to students isn’t one of her better ideas. I say this with all due respect — and discernment. And that’s the same respect and discernment that we owe to the work of our students.

Leonard Cassuto

Leonard Cassuto is a professor of English at Fordham University, where he was named Graduate Teacher of the Year in 2009.

Digital Students, Industrial-Era Universities

The American university, like the nation’s other major social institutions — government, banks, the media, health care — was created for an industrial society. Buffeted by dramatic changes in demography, the economy, technology, and globalization, all these institutions function less well than they once did. In today’s international information economy, they appear to be broken and must be refitted for a world transformed.

At the university, the clash between old and new is manifest in profound differences between institutions of higher education and the students they enroll. Today’s traditional undergraduates, aged 18 to 25, are digital natives. They grew up in a world of computers, Internet, cell phones, MP3 players, and social networking.

They differ from their colleges on matters as fundamental as how they conceive of and utilize physical plant and time. For the most part, universities operate in fixed locales, campuses, and on fixed calendars, semesters and quarters with classes typically set for 50 minutes, three times per week. In contrast, digital natives live in an anytime/anyplace world, operating 24 hours a day, seven days a week, unbounded by physical location.

There is also a mismatch between institutions of higher education and digital natives on the goals and dynamics of education. Universities focus on teaching, the process of education, exposing students to instruction for specific periods of time, typically a semester for a course and four years of instruction for a bachelor’s degree; digital natives are more concerned with the outcomes of education — learning and the mastery of content, achieved in the manner of games, which is why an online game pro will never boast about how long she was at a certain level, but will talk about the level she has reached.

Higher education and digital natives also favor different methods of instruction. Universities have historically emphasized passive means of instruction — lectures and books — while digital natives tend to be more active learners, preferring interactive, hands-on methods of learning such as case studies, field study and simulations. The institution gives preference to the most traditional medium, print, while the students favor new media — the Internet and its associated applications.

This is mirrored in a split between professors and students, who approach knowledge in very different ways. Traditional faculty might be described as hunters who search for and generate knowledge to answer questions. Digital natives, by contrast, are gatherers who wade through a sea of data available to them online to find the answers to their questions. Faculty are rooted in the disciplines and depth of knowledge, while students think in increasingly interdisciplinary or a-disciplinary ways, with a focus on breadth.

Universities and students also now view the learner in polar fashion. Higher education focuses on the individual, an ideal captured in 1871 by James Garfield, who famously described the ideal college as Mark Hopkins, the 19th-century president of Williams College, at one end of a log and a student at the other. Today’s digital natives are oriented more toward group learning, multiple “teachers” or learning resources, and social networking, characterized by collaboration and sharing of content. This approach poses an ethical challenge for universities, which under certain circumstances view collaboration as cheating and content sharing as plagiarism.

These are substantial gaps, complicated by the disparities in the way colleges and digital learners see their roles in education. Higher education is provider-driven in belief and practice. That is, the university, through its faculty, determines the curriculum, the content, the instructional methods, the study materials, and the class schedule. Digital natives tend to be consumer-driven, preferring to choose, if not the curriculum and content they wish to study, then the instructional method by which they learn best, the materials they use to learn, and the schedule by which they choose to study.

So what should be done? First, we need to recognize that this is not the first time colleges and their students have been out of step. In the early 19th century, as the industrial revolution gathered momentum, colleges in the main clung stubbornly to their classical curriculums, rooted in the ancient trivium and quadrivium, and to outmoded methods of instruction. College enrollments actually declined, and numerous institutions closed their doors. Bold colleges like Union, in Schenectady, New York — among the earliest adopters of modern language, science and engineering instruction — boomed in enrollment, topping Yale and Harvard combined.

Today, with college essential to obtaining most well-paying jobs, we will not see higher education enrollments drop. However, tardiness in acting will give impetus to the growth and expansion of alternative higher education — for-profit and nontraditional educational institutions that have been more successful in offering programs better geared to digital learners and their older counterparts.

Second, it is important to ask how much colleges and universities need to change. In 1828, facing industrialization and a Connecticut legislature that disapproved of Yale’s classical curriculum, the Yale faculty responded with a report which asked, in part, whether the college needed to change a lot or a little. This, Yale’s faculty said, was the wrong question. The question to be asked, they argued, was: What is the purpose of a college? This remains the right question today.

What is certain is that higher education needs to change, because students won’t, and the digital revolution is not a passing fad. To be sure, the purposes of the university have not changed. They remain the preservation and advancement of knowledge and the education of our students for humane, productive and satisfying lives in the world in which they will live. The activities of universities will continue to be teaching, research and service.

What must change, however, is the means by which we educate the digital natives who are and will be sitting in our classrooms — employing calendars, locations, pedagogies, and learning materials consistent with ways our students learn most effectively. It means that the curriculum must meet our students where they are, not where we hope they might be or where we are. All education is essentially remedial, teaching students what they do not know. This, for example, is a generation that is stronger in gathering than hunting skills. So let the curriculum begin with breadth and move to depth. Cheating and plagiarism violate the cardinal values of the academy, so let’s make it crystal clear to our students how and why they differ from sharing and collaboration.

It doesn’t make sense anymore to tie education to a common process; a uniform amount of seat time, measured by a fixed clock, is outdated. We all learn at different rates. Each of us even learns different subject matters at different rates. As a consequence, higher education must in the years ahead move its emphasis from teaching to learning, from common processes to common outcomes. With this shift will come the possibility of offering students a variety of ways to achieve those outcomes, rooted in the ways they learn best — an approach Alverno College in Milwaukee embraced four decades ago.

This needed transformation of the American university is merely the task of taking a healthy institution and maintaining its vitality. In an information economy, there is no more important social institution than the university in its capacity to fuel our economy, our society and our minds. To accomplish these ends, the university must be rooted simultaneously in our past and our present, with its vision directed toward the future.

Traditional Colleges and Digital Students

Colleges                                        | Students
Fixed time (semesters, credits, office hours)   | 24/7 (anytime)
Location-bound                                  | Location-free
Provider-driven                                 | Consumer-driven
Passive learning                                | Active learning
Abstract                                        | Concrete
Traditional media                               | New media
Teaching (one-way instruction)                  | Learning (interactive)
Individual (cheating)                           | Group (collaboration)
Depth / hunters                                 | Breadth / gatherers
Arthur Levine

Arthur Levine is president of the Woodrow Wilson National Fellowship Foundation and president emeritus of Teachers College, Columbia University.

The Empathic Professor

Biological theorist Richard Dawkins writes in The Selfish Gene that if we wish "to build a society in which people cooperate generously and unselfishly towards a common good, [we] can expect little help from biological nature … because we are born selfish." Observers of academic scandal and fraudulent scholarship often attest to that. Conversely, economist Jeremy Rifkin believes "human beings are not inherently evil or intrinsically self-centered and materialistic, but are of a very different nature — an empathic one — and that all of the other drives that we have considered to be primary — aggression, violence, selfish behavior, acquisitiveness — are in fact secondary drives that flow from repression or denial of our most basic instinct."

Who is right, at least when it comes to professors?

Certainly, violence and aggression are facts of life on the typical campus, ranging from assaults, hate speech and shootings to gridiron wars ignited by tribal bonfires, beer kegs and primal weekend rituals.

As director of a journalism school at a science-oriented institution, I can attest that the empathic professor not only exists but daily displays the grace, forgiveness and tolerance usually associated with higher callings. Ours is such a calling. Who but the empathic professor, from overworked adjunct to distinguished don, could profess the same tenets of basic chemistry, composition and calculus semester upon semester, until seasons blend into one career-long academic calendar, were it not for love of learning and the instilling thereof in others?

Teachers, not politicians, shape generations. It has been so since Socrates and Confucius, and ever will be. (Would that state legislatures remember that when allocating funds!)

Too often, it seems, we report the antics, crimes and shenanigans of the Dawkins educator whose selfish gene believes attaining tenure is an entitlement and filing complaints, a fringe benefit.

Within a typical week, as director of 50 teachers, teaching assistants and staff members, I witness or experience life-changing empathy. I hear it through the open doors of colleagues advising students, in the break room celebrating birthdays or milestones, and in the hospital visiting a colleague gravely ill but still grading.

Within that same week, of course, I hear gossip, endure factions at faculty meetings, and get anonymous letters and email. Most of my professors realize my English Ph.D. includes a specialty in textual editing, so I can decipher who sent what. (See “Such Stuff as Footnotes are Made On.")

I’m writing about the empathic professor after a week enduring the Dawkins kind, not so much to remind myself that I am surrounded by kinder colleagues as to approach the topic philosophically so that you, too, might focus as I must on the good rather than the disgruntled in our midst. Is it possible that both Dawkins and Rifkin are right, or wrong, or partially so, or more right on one day but wrong the next, especially in the Ivory Tower? I am not a postmodernist promoting truth as illusion. Rather, I am a media ethicist and communication theorist who writes about the human condition, or the inharmonious duet in our heads conveying contrary instructions about the world and our place in it.

Professors, by and large, believe in the human condition but generally do not dwell on it in their disciplines. Ethicists must. In some ways, the human condition sounds eerily like a cable network of talking heads telling us 24/7 that climate change is a political conspiracy; energy consumption, a corporate one; universal health care, a socialist plot; pandemics, a pharmaceutical one; and so on.

Or not.

Although few admit it, on most days we are paradoxical creatures who traipse in our encounters listening to cymbals of consciousness and piccolos of conscience. The former tells us, “We come into the world alone, and we leave it alone” while the latter asserts, “What is in me is in you.”

Which is right?

Reading Inside Higher Ed, or any educational news site, we discern the chromatic scale of aggression, violence, selfish behavior and acquisitiveness and, less often, the empathic tonalities of kindness, forgiveness and compassion. For better or worse, the mainstream media and the blogosphere reflect the human condition, what Wordsworth called the still, sad music of humanity.

As such, we are both homo avarus and homo empathicus. Avarus, Latin for “greedy,” dwells in the material world; empathicus, in a more metaphysical one. Our life’s work is that of a choral director attempting to harmonize them so that one enlightens the other. When we do, consciousness allows us to see the world as it actually is rather than how we would like it to be; to foresee the impact of our actions before taking them; and to assess consequences of past actions to make informed choices in the future. Only then can we meet the demands of the conscience: that we love and are loved by others; that we have meaningful relationships with others; and that we contribute to community.

In my 2005 book Interpersonal Divide: The Search for Community in a Technological Age, I write that conscience grants us peace when we realize that how we treat others determines our own well being and fulfillment. "Community," I assert, "is founded on that principle, from secular laws to religious morals."

Jeremy Rifkin writes about “empathic consciousness,” an organizing principle in his new book, The Empathic Civilization: The Race to Global Consciousness in a World in Crisis. However, when he states, "The irony is just as we are beginning to glimpse the prospect of global empathic consciousness, we find ourselves close to our own extinction," he easily could be discussing what I avow: the specter of global conscience.

Appropriately, that prospect is found in Article 1 of the United Nations’ Universal Declaration of Human Rights: “All human beings are born free and equal in dignity and rights. They are endowed with reason and conscience [emphasis added] and should act towards one another in a spirit of brotherhood.”

In media and education, we have listened too long to the cymbals of consciousness drowning the piccolos of conscience. The more educators raise consciousness about any number of public ills, the longer we seem to debate, explicate and irritate each other rhetorically rather than conscientiously, and the closer society comes to catastrophe. Overemphasis of consciousness has resulted in the repression of global conscience, our truer nature.

Conscience acts on simple truths. It does not debate whether climate change is fact or fiction; it intuits that burning so much fossil fuel is harmful to health and hemisphere. Consider the rhetoric of consciousness before the BP oil spill in the Gulf of Mexico — offshore drilling is vital to the economy — and compare that now to the awareness that pings within us daily. Neither does conscience associate universal health care with political systems, but with the bodily systems necessary to enjoy freedom, equality and dignity. It knows pandemics occur regardless of corporate balance sheets when the balance of nature is disrupted.

As The New York Times reported in 1992, Westerners advocating progress “thought they were nearly invincible, at least as far as infectious disease was concerned. They thought they could interfere with ecosystems, and ship goods and people around the world, with little regard for the effect not only on the balance of nature, but also on their own health.”

That balance of nature is on the agenda again and will be throughout our lifetime and our students’ and their grandchildren's lifetimes. There may not be any lifetimes thereafter unless we as teachers can instruct our charges to harmonize conscience and consciousness so that the duet augurs a new era of ethical awareness of the world and our sustainable place in it.

So I will close by reminding myself as well as others at the end of a trying academic year of slashed budgets, furloughs and firings that the empathic genes of our better natures will prevail. Otherwise the campanile also tolls for us.

Michael Bugeja

Michael Bugeja directs the Greenlee School of Journalism at Iowa State University. His latest book, Vanishing Act: The Erosion of Online Footnotes and Implications for Scholarship in the Digital Age (Litwin Books), is co-authored with Daniela Dimitrova, an Iowa State colleague.

All Summer in a Day

(With apologies to Ray Bradbury. Text in italics is quoted from his short story, "All Summer in a Day")





“Do the scientists really know? Will it happen today, will it?”

“Look, look; see for yourself!”

From my fourth-floor office window, I watched my students spring forth from their underground architectural studio to the plaza above, like meerkats spilling out of their dens. They came in twos and threes, cameras swinging from their necks, balancing their models as they surged out of the door, looking up at the sky expectantly.

The sun came out.

It was the color of flaming bronze and it was very large. And the sky around it was a blazing blue tile color. And the jungle burned with sunlight as the children, released from their spell, rushed out, yelling, into the springtime.

Quickly they tilted their models in the fleeting sun, capturing the shadows that they had not seen for several cloudy, rainy days.

And then—

In the midst of their running one of the girls wailed.

Everyone stopped.

The girl, standing in the open, held out her hand.

“Oh, look, look,” she said, trembling.

They came slowly to look at her opened palm.

In the center of it, cupped and huge, was a single raindrop.

She began to cry, looking at it.

They glanced quietly at the sky.

“Oh. Oh.”

A few cold drops fell on their noses and their cheeks and their mouths. The sun faded behind a stir of mist. A wind blew cold around them. They turned and started to walk back toward the underground house, their hands at their sides, their smiles vanishing away.

Then they came back inside, hopped on their laptops (not up the stairs to my office), and begged for a time extension on their assignment.

“I had to watch my brother play football this weekend.”

“Things don’t always go as we plan.”

“The forecaster said…”

I did not respond.

“But this is the day, the scientists predict, they say, they know, the sun…”

They needed sunlight for their assignment, due the next day. They needed to observe and photograph clear shadows on their architectural models, using a sundial to simulate these shadows at various times of the day and year. They’d had two weeks already, the first week and a half of which had been unremittingly sunny.

I waited a while longer. Finally, when the sun still wasn’t forthcoming, I wrote back with some constructive advice. I told them what I would do in their position, had I painted myself into that particular corner — I would use the light from a slide projector, which is less than ideal, but better than nothing. They didn’t like my suggestion. They parsed words like “partial credit” and brought out the predictable “You didn’t say that in class.” They wanted the sun, the real sun, which would redeem them and make everything all right. And at the eleventh hour, it came back out.

… they were running and turning their faces up to the sky and feeling the sun on their cheeks like a warm iron; they were taking off their jackets and letting the sun burn their arms.

“Oh, it’s better than the sun lamps, isn’t it?”

“Much, much better!”

Most of them got to see the sun for just enough time to finish the assignment as intended. But I found out later just how alien the sun still was to them, and sadly, to me, though we live on Earth and not in the near-perpetual rain of Venus, like the children in Bradbury’s story.

One of my students, a girl with clear blue eyes and smooth, straight, light brown hair, came to visit me shortly after the first test. She wanted to check which questions she’d gotten wrong, since she’d done so poorly. She was frustrated that she’d focused too much on the wrong things while studying, and at first I was at a loss to help her. Finally we came to a moment of enlightenment. She was surprised that I had asked her to figure out where the sun would be in the sky at various times of the day and year. I had expected that she and her peers had internalized something from recording the sun’s position during their sundial exercise. In short, I had expected her to be like Margot, an earth-born girl who knew the sun by heart.

But Margot remembered.

“It’s like a penny,” she said once, eyes closed.

“No it’s not!” the children cried.

“It’s like a fire,” she said, “in the stove.”

“You’re lying, you don’t remember!” cried the children.

My student admitted that she didn’t really understand this business about the sun. As I flipped through the appendixes of the textbook looking for sun path diagrams to show her, it became clear that I still didn’t really, either. I still needed to look it up. As I lay in bed that night, I dreamt up a “sun dance” that I would do in class the next week. It was designed to help the students, and me, remember where the sun is in the summer, winter, spring and fall. Because we all know it, but we all forget. Sitting in that oversized, refrigerated auditorium where my lectures are held, there’s no way we could know what the sun is doing. So in the next class, we stood up and danced:

“It’s the winter solstice. Face south. Stretch out your arms, a little forward. Your left fist is the sun, rising above the horizon to the south of east. Lift it up through the southern sky, in front of you. The angle is low; it will reach into the building. Now raise your right hand to meet it at its highest point, and arc back down to the south of west.”

“OK, now it’s the equinox. Reach your arms straight out to the sides. On the equinoxes, the sun rises directly in the east and sets in the west. It’s now higher in the sky.”

“Now it’s the summer solstice. Stretch your left arm behind you. The sun rises north of east, shines on your back, the north face, at a low angle. As it rises to its apex, it’s even higher in the sky; now you can block it with an overhang. As it sets, the north façade receives this low, western sun.”

… they squinted at the sun until tears ran down their faces, they put their hands up to that yellowness and that amazing blueness and they breathed of the fresh, fresh air and listened and listened to the silence which suspended them in a blessed sea of no sound and no motion. They looked at everything and savored everything.

But I learned, months later, that they didn’t appreciate the dancing. They complained about it to my program chair and on my course assessments, saying it was beneath them, that I was talking down to them.

“She belongs in an elementary school classroom.”

“It is unfair to assume that college classes should involve dancing.”

“No,” said Margot, falling back.

They surged about her, caught her up and bore her, protesting, and then pleading, and then crying, back into a tunnel, a room, a closet, where they slammed and locked the door. They stood looking at the door and saw it tremble from her beating and throwing herself against it. They heard her muffled cries.

Once I learned about the students’ objections, I reacted as quickly as I could. In class, I became more subdued, more opaque. I tried to show more and explain less. I stopped dancing.

Spring came, and with it, more chances for us to get out of our windowless classroom and to see firsthand the work of architects and builders who worked with the sun in a far more direct and convincing way than my abstract explanations could ever convey. I learned the hard way, like Margot, that I can’t really describe the sun. The students have to see it for themselves.

On the last day of classes, they evaluated me again.

“Your opinions are important as we make plans for this course in the future. Please be candid about what topics and experiences you felt were useful, and which ones weren’t,” I heard myself say. What I thought was the same thing all new teachers think: “I am trying to teach you in the best way I know how. Please be kind.”

They stood as if someone had driven them, like so many stakes, into the floor. They looked at each other and then looked away. They glanced out at the world that was raining now and raining and raining steadily. They could not meet each other’s glances. Their faces were solemn and pale. They looked at their hands and feet, their faces down.


One of the girls said, “Well…?”

No one moved.

“Go on,” whispered the girl.

I left them there, filling out that one last set of bubbles before they were set free. For me, retreating down the corridor, it was a moment of reckoning; for them, a chore barely restraining them from running out into the May sunshine.

They walked slowly down the hall in the sound of cold rain. They turned through the doorway to the room in the sound of the storm and thunder, lightning on their faces, blue and terrible. They walked over to the closet door slowly and stood by it.

Behind the closet door was only silence.

They unlocked the door, even more slowly, and let Margot out.

Elizabeth Grant

Elizabeth Grant is an assistant professor in the College of Architecture and Urban Studies at Virginia Tech.

