Life

No More Fancy Fonts

It’s difficult to believe now, but not so long ago, I looked forward to making up syllabuses.

Once the grand meal of the course had been structured and I’d chosen an exciting title, the syllabus design was my dessert. I took the word “design” quite literally, having fun with frames and borders, trying out different fonts, fiddling with margins.

Then, after printing out the final document, I’d sit at my kitchen table and add images saved for the purpose from old magazines, vintage catalogs, pulp advertising, obscure books, and other ephemera. Fat cherubs blowing their trumpets would announce Thanksgiving break; a skull and crossbones marked the spot of the final exam. My masterpiece was a course on the work of Edgar Allan Poe, whose syllabus was a gothic folly with a graveyard on the front page and cadaver worms crawling up the margins.

Over time, my syllabuses grew less creative. I still gave my courses what I hoped were enticing titles, and I’d usually add an image to the front page, but nothing more. In part, I was afraid my quirky designs might make the course seem less serious; I also had far less free time than I used to. But mostly, it was the number of disclaimers, caveats and addenda at the end of the syllabus that made my designs seem out of place. All these extra paragraphs made the syllabus seem less personal, and more institutional -- but then, I realized, perhaps it was time I grew up and began to toe the party line.

Those were the good old days. Now, at a different institution, I teach in a low-residency program whose courses are taught, in part, online. The syllabus template is provided by the institution: Times New Roman, 12-point font, 1-inch margins -- and don’t forget the “inspirational quote” at the top of the page.

The Course Description is followed by the list of Course Objectives, Learning Outcomes, Curriculum and Reading Assignments, Required Reading, Assessment Criteria and so on, all the way down to the Institute’s Plagiarism Policy and Equal Opportunity Provisions. Colleagues tell me it’s the same almost everywhere now; the syllabus is now composed mainly of long, dry passages of legalese.

I no longer design my own course titles -- or, if I do, they need to be the kind of thing that looks appropriate on a transcript, which means “Comparative Approaches to the Gothic Novel,” not “Monks, Murder and Mayhem!” There’s an extra plague in online teaching, however, in that -- at least, at the institution where I’m currently employed -- all course materials, including weekly presentations, must be submitted months in advance.

This, I’m told, is not only to ensure that books are ordered and copyrights cleared, but also for the various documents to pass along the line of administrative staff whose job includes vetting them in order to be sure no rules have been violated, then uploading them in the appropriate format. Moreover, a syllabus, we are constantly reminded, is a binding legal document; once submitted, it must be followed to the letter. Omissions or inclusions would be legitimate grounds for student complaint.

Gone, then, are the days when I could bring my class an article from that morning’s New York Times. Now, when I stumble on a story, book or film that would fit perfectly with the course I’m currently teaching, I feel depressed, not excited. I can mention it, sure, but I can’t “use” it in the class. Nor can I reorient the course in mid-stream once I get to know the students; I can’t change a core text, for example, if I find they’ve all read it before; I can’t change the materials to meet student interests or help with difficulties, as I once did without a second thought.

This is especially perplexing in online teaching, where it’s so easy to link to a video, film clip, or audio lecture. We have an institution-wide rule that such materials may not be used unless accompanied by a written transcript for the hearing-impaired. When I object that there are no hearing-impaired students in my small class of six, I am told that no, there are currently no students who have disclosed such an impairment. The transcripts are needed in case any of them should do so -- in which case, they would be immediately entitled to transcripts for all audio-visual material previously used in the course. Sadly, those who pay the price for this assiduous care of phantom students are the six real students in the course.

In brief, what used to be a treat is now an irksome chore.

Instead of designing a syllabus, I’m filling out a template, whose primary reader is not the student, not even the phantom potential-hearing-impaired student, but the administrators and examiners who’ll be scanning it for potential deviations from standard policy.

Sitting at my kitchen table with scissors and glue, I always felt as though the syllabus -- and, by implication, the course -- was something that came from within me, something I had literally produced, at home, with pleasure and joy.

Now, by the time the course is finally “taught” months after the template has been submitted, it feels like a stillbirth from a mechanical mother.

Author/s: 
Mikita Brottman

Mikita Brottman is chair of the humanities program at Pacifica Graduate Institute.

Playing Mozart on the Titanic

Scattered through the Modern Language Association’s 2009 convention were telling sessions devoted to the state of higher education. Compelling testimony was offered in small and sometimes crowded rooms about the loss of long-term central features of the discipline, from foreign language study to graduate student support to tenure track jobs for new Ph.D.'s. In many respects, the MLA’s annual meeting is more responsive to higher education’s grave crisis than the meetings of the other humanities and social science disciplines that should also be part of the conversation, from anthropology to classics to history and sociology. There are simply more MLA sessions dealing with such issues than there are at other disciplinary meetings. Yet there was also throughout the MLA convention a strong sense of irrelevant business as usual, in the form of innumerable sessions devoted to traditional scholarship. There is a certain poignancy to the orchestra playing Mozart while the Titanic slips beneath the waves: We who are about to die salute our traditional high cultural commitments.

Of course we should sustain the values and the ongoing research that make humanities disciplines what they are. But the point is that the ship does not have to go down. There is action to be taken, work to be done, organizing and educating to do when faculty members and graduate students come together from around the country. Disciplinary organizations thus need to revise their priorities to confront what is proving to be a multi-year recession in higher education. As I argue in No University Is an Island, the recession is prompting destructive changes in governance, faculty status, and educational mission that will long outlast the current crisis. Because MLA’s members are already talking about these matters in scattered ways, it is time for the organization to take the lead in revising the format of its annual meeting to address the state of higher education -- and prepare its members to be effective agents -- in a much more focused, visible, and productive way. Then perhaps other disciplines will follow.

A generation ago, when the MLA’s Graduate Student Caucus sought to reform the organization, it circulated several posters at annual meetings. Most telling, I thought, was a photograph of the Titanic, captioned “Are you enjoying your assistant-ship?” It was no easy task back then convincing the average tenured MLA member that the large waves towering over our lifeboats would not be good for surfing. Now the average college teacher is no longer eligible for tenure, and the good ship humanities is already partly under water.

The MLA’s response to a changing profession was to increase the number and variety of sessions, to give convention space to both fantasy and reality. The MLA would cease to be exclusively a platform for privilege. The organization would become a big tent. Unfortunately, the big tent is looking more like a shroud. The humanities are drowning. It is time to rethink the annual meeting to make it serve a threatened profession’s needs.

Until we can secure the future of higher education, we need to be substantially focused on money and power. That, I would argue, should be the theme of the 2010 annual meeting, and the structure of the meeting should be revised to reflect that focus. Instead of simply offering incoherent variety, the MLA should emphasize large meetings on the current crisis and its implications. And I do not mean simply paper presentations, telling as local testimony can be.

Disciplinary organizations need to offer substantial training sessions -- typically running several hours each and perhaps returning for additional sessions over two or three days -- that teach their members the fundamentals of financial analysis and strategies for organizing resistance. The AAUP, for example, teaches summer workshops each year that show faculty members the difference between budgets, which are fundamentally planning documents riddled with assumptions, and financial statements, which report actual expenditures for the previous year. We work not with hypothetical budgets but with examples from a dozen universities. Attendees learn that there are virtually always pots of money not listed on a university budget at all. A budget, MLA members will benefit from learning, is essentially a narrative. It can and should be deconstructed. I expect the AAUP would be willing and able to conduct such training sessions at disciplinary meetings. Indeed we already have the PowerPoint presentations and detailed handouts we would need. We have faculty members who specialize in analyzing university finances ready to serve the MLA and other disciplinary organizations.

The AAUP could also join with the AFT and the NEA to offer workshops in the fundamentals of collective bargaining, explaining how faculty and graduate employees at a given school can create a union that meets their distinctive institutional needs and embodies their core values. We can stage scenarios that give faculty members and graduate student activists experience in negotiating contracts. And the MLA should schedule large sessions that help faculty in places where collective bargaining is impossible recognize that organizing to have influence over budget decisions and institutional priorities is also possible without a union. The organization should also invite the California Faculty Association to conduct a large workshop on ways to reach out to students, parents, alumni, and other citizens and rebuild public support for higher education. CFA has been running a terrific campaign toward that end. The point is to empower faculty members to be the equals, not the victims, of campus administrators.

I am urging an annual MLA meeting that promotes not only literary studies but also material empowerment, that equips the members of the profession with the skills they need to preserve an appropriate environment for teaching and research. If the MLA takes the lead in reshaping its annual meeting this way, other disciplines will follow.

Author/s: 
Cary Nelson

Cary Nelson is president of the American Association of University Professors. He has been an MLA member for 40 years. His new book, No University Is an Island: Saving Academic Freedom, has just been published by New York University Press.

Putting the Ph.D.'s to Work

Even old news can be dismal, and that is the case at hand. For about 40 years, by my calculation, American universities have been admitting too many candidates for doctorates in the liberal arts and the social sciences and, startling attrition along the way notwithstanding, have produced too great a supply of Ph.D.'s for a dwindling demand. There are proposed remedies for this injustice, a system that prepares people exclusively for work that will not be available to them, but I want to address a different problem. What can we do with, and for, the Ph.D.'s and those who dropped out short of the final degree: something that will be useful for them and, not accidentally, will provide a benefit to the nation?

Those who have earned or at least pursued doctorates in the humanities or social sciences, or professional degrees in law and business, whom I want to include in my argument, have learned how to learn, how to conduct research, and in many cases have acquired a second language. Field work or study abroad may have further informed them about other cultures. Thus, although their training has been geared to turn them into replicas, if not clones, of their former professors and reportedly has not prepared them for competing in the world outside the academy, they have useful skills, which could also be marketable. The question is how to bring them to market.

My proposal is for a national program that combines some of the elements of Works Progress Administration programs from the Great Depression, the Peace Corps, and the Fulbright Awards. I mention the WPA not because we have entered another depression — so far so battered, but also so far so good — but because its various programs took the unemployed and found them work which, with some notorious exceptions, the nation needed done. And this effort included support for writers and artists. The Peace Corps and the Fulbrights, with their histories of sending Americans abroad (and bringing foreigners here as Fulbright scholars) have proven their intellectual worth, their pragmatic value, and their foreign policy bona fides. I am, however, suggesting them as models of successes, not as templates.

Volunteers for this new program, after training most plausibly sponsored by the State Department, would be sent abroad, chiefly to developing countries where they could teach at high levels, in some cases study (especially languages), and work in civil programs according to their abilities and training, for example, in court administration and in the organization of self-help associations and business start-ups. The actual work will need to be directed by the skills of the volunteers, not by an arbitrary menu of projects or by ukase, though selection of the volunteers for the program will have to contribute to the shaping of its execution.

The work, as I imagine it, would not replicate or overlap with the work of Peace Corps volunteers. First, the program would recruit from the limited pool that I have described. Second, the work needs to be white-collar — educational at a high level, administrative, or organizational; volunteers will not be making bricks or laying water pipes or teaching in primary and secondary schools. Third, depending on the interest of the host country and the volunteers, periods of service could be longer than the 27-month tour in the Peace Corps. Fourth, mastering a new "strategic" language will be a primary requirement of volunteers, no matter their specific daily work — a point I will return to shortly. Fifth, at the completion of a tour, volunteers will be encouraged to maintain the linguistic skills and the cultural information they acquired while abroad. This may be done through the kind of employment they find, ideally in government service, but industry and academe could serve as well. (I say encourage rather than require because we no longer have conscription, and the unwilling are never very happy or useful.) It seems obvious to me that banking people competent in language against a future when their skills will be needed will be a good investment.

The short-term benefits are clear enough. Like the Peace Corps and the Fulbrights, the program has the potential to increase the familiarity of a generation of young Americans with other countries, their languages and cultures. Like them, it is a way of conducting soft diplomacy in which the character of the participants could complement and, I expect, enhance our national policies and interests. Those who expand knowledge or help to improve civil institutions tend to command respect, even affection, while revealing — perhaps to the astonishment of many abroad — that Americans are not the horned minions of the Great Satan. These are the expectations that the program I suggest must meet, maintained rigorously with supervision and review. I have no interest in a program that enables young or even middle-aged people to find themselves or that simply keeps them out of the job market for a few years.

The longer-term benefits are, I think, more interesting and more valuable. If, as I have said, mastery of language is a primary requirement, returning volunteers will be available who know languages that are neither widely taught nor spoken in the United States; one principle guiding the placement of the volunteers should be the importance of the languages spoken where they are posted. Thirty years after we learned, in the aftermath of the assault on the American embassy in Tehran, that the Central Intelligence Agency did not have a single Farsi speaker there, we still have intelligence and military services that sorely lack people who can speak and read the languages of Africa, Asia, and the Middle East. Joshua Keating of Foreign Policy magazine has recently pointed out that only 13 percent of CIA employees speak a second language. He tells a still bleaker story. In March of 2009, the administration wanted a "civilian surge" of 300 experts in language and administration to serve in Iraq and Afghanistan. A month later, State and USAID could not find the people, and the over-committed military commands had to find the staff.

The program I am proposing, had it been established five years ago, might have been able to provide that missing expertise, at least a good part of it. Assuming only a couple of thousand volunteers a year in the program — a number that could certainly grow as it ripens and perhaps broadens according to needs — it would be manageable and not very expensive. As benchmarks, and these are only points of departure, the practices and the budgets of the U.S. Scholars Program of the Fulbright Awards and of the Peace Corps are instructive. Both offer transportation to the host country and "maintenance" or "an allowance" based on local living costs; Fulbright expenses are higher because their scholars generally live in high-cost countries. The Peace Corps offers deferral of student loan payments and in some cases cancellation of Perkins Loans, both of which would be attractive to the volunteers I have in mind.

If, then, we want a rough-and-ready baseline of costs to fund a pilot program of, say, 1,000 volunteers to begin with, we can simply take the approximate cost per volunteer for the Peace Corps, since I envision them living in conditions more like those of Peace Corps volunteers than of Fulbright students. This offers a cost per volunteer of about $45,000, or some $45 million in all for a cohort of 1,000. Given the number of unemployed academics, recruiting this many should not be difficult and would permit selectivity.

When the economy improves a bit, imagine some of the alumni of this program entering academe not bitter from four years of adjuncting without health insurance, but energized by new experiences, and bringing unusual combinations of knowledge to their universities. Imagine if every English or history department had someone who had recently lived in the Middle East or Africa?

There is another benefit that could actually respond to a serious, if only simmering or festering, problem heading right at us now. In the next several years, approximately 250,000 federal employees, many of them at the top as GS-15 or SES workers, will be retiring. How to replace them or, more to the point, where simply to look for their replacements, is already proving to be vexing and nerve-racking. My belief — it is more than a hunch — is that many of these returning volunteers would be interested in federal service or perhaps in service with state governments, which also are facing the same problems of baby-boomer retirements as the federal government. They will already have been exposed to the terms and the values of working as public servants. They will have acquired, at the government’s expense, new skills and may have a sense of obligation or loyalty, which would be welcome. Perhaps offering student loan forgiveness or reduction in return for government service after the tour abroad would be a strong inducement.

Many, if not all, will have the academic credentials that public agencies routinely look for. If the program I propose could be established soon and quickly grow to several thousand new recruits every year — and recruits are available now and will continue to be until we change our policies of graduate education — we would have made a respectable down payment on this human capital obligation. Instead of mortgaging our future, as many programs often appear to do, we could actually be paying down the mortgage by drawing on the skills we have banked.

Beyond the value of sending “missionaries” or soft diplomats abroad, the two additional goals I have presented — the acquisition of strategic languages and the restocking of the public sector — are distinct, but not at odds with one another, nor do I worry that having more than one goal clouds the mission or makes the program unwieldy: both are worthy and important, and neither excludes the other. Moreover, by turning to the supply of unemployed or underemployed men and women, we will be putting to work minds that have been trained and skills that have been raised at great expense. This may seem an exercise in good works and foreign policy, but it is no less, in my opinion, a matter of thrift and profit for the nation.

Author/s: 
Stephen Joel Trachtenberg

Stephen Joel Trachtenberg is president emeritus and university professor of public service at George Washington University.

My Book, My Dreams

I wrote my first novel, a cross between The Last of the Mohicans and Shane, when I was eight or nine years old. I wrote it on small spiral bound notebooks and illustrated it by hand. Later I tore all the pages out of the notebooks and stapled them together in thick stacks. I wasn’t a literary prodigy, just a kid who loved Star Wars, comics and novels. I was a geek who was not afraid to dream the literary dream. In the years that followed, I continued dreaming that dream. After spending several years writing short stories and hundreds of poems, which I dutifully relegated to hardbound composition notebooks, I wrote my second novel. I was in the ninth grade and I knew how to type. I’ll never forget the thrill of typing up that manuscript. Graduating from handwriting to typed text made me feel like a very serious writer. Before I graduated high school, I wrote a third novel, and a collection of short stories, both of which I carefully typed up, copied and bound at a photocopy center.

I’ve often wondered what would have happened if I had kept that literary faith after going to college. What did happen was that I majored in literature, went to graduate school and began my career as a literary critic. The old dreams of being a writer of novels and poems were replaced by dreams of being a published literary critic, an author of scholarly articles and monographs that would draw the interest of my peers. My whole life had been about reading and self-expression, and now, as a professor of literature, I wanted – no, I needed – to express myself and be read. I began writing articles and did quite well. It was exhilarating. Then I faced the herculean task of shaping a book from the inchoate mass of my dissertation. It took me six or seven years, and two separate tenure clocks, to complete it. My book, I’m proud to say, was personal, original, and timely. I dreamed that it would be read, that it might matter. I never thought I would wonder if writing my book was really worth it.

There were many things my mentors never told me about being an academic. I was never taught how to write a book (as opposed to a dissertation), or warned about the protocols, timelines and politics of trying to get a book published. No one ever spoke to me about what it might mean to publish in a second-tier university press, get one bad review and not really be read as much as you might hope. I knew that academic writers could be stars, and that some never got published, or published bad books that no one cited. But I did not know about the vast corpus of middling, pretty-good (or better) books and authors, which for a variety of reasons, justified or not, simply don’t make much of an impact or a difference. That’s a special kind of purgatory that graduate students and assistant professors don’t hear too much about. Well, I’m something of an expert in this subject. The story of my first book is not unlike that of a long-suffering, sympathetic character in a Dickens novel who quietly suffers a series of slights, injustices and betrayals, but without the cathartic redemption or resolution that sublimates her mournful journey.

The good news was that I got my book published at a university press, not a top one, but a good one with a good backlist. The bad news was that my book would not be published in affordable soft cover, but in a more expensive library edition, meaning that no graduate student would ever buy my book the way I bought so many books as a student. My book would not sit on the crowded bookshelves of a studio apartment in a college town while someone pondered a dissertation or argued the finer points of theory with some friends. But that was OK. As long as it got into libraries, that would be fine. There might not be many notes in the margins but it would still be read. Then my press required me to change the title of my book to something flatter, more descriptive, to help sell copies to libraries I suppose. My tenure clock was running out, what could I do? I let them do it. And I even made my peace with it, believing that "If you build it, they will come." I waited for them to come.

Unless you publish with a top-tier press, and your book makes a big splash, don’t expect much fanfare in the years that follow the publication of your book. There will not be release parties at conference exhibit halls, posters, “buzz” or anything like that. Very few authors get that experience. For two or three years, it was as if my book did not exist. Then three reviews appeared. One slammed me, the other one was somewhat positive, and the third was embarrassingly short and uninformative. Within the echoing silence of the publication of my first book came the first whispers of feedback, and it was pretty clear: My book was interesting, it had glimmers, but it was mediocre. (I don’t agree with that assessment but I’m just trying to report events as honestly as possible.) Anyway, that hurt a lot.

Then something pretty surprising happened. I realized that some of those who really should have read my book were not interested in it. The first such person was a graduate student I was advising whose dissertation intersected with my book’s subject matter. I guess she never cracked the book to notice its table of contents. Then an acquaintance of mine tried to publish a book on the exact same subject without mentioning me at all. Let’s say, for the sake of illustrating my point, that my book was the first ever and only book on hats in literature. This fellow, who knew about my book, had his own book manuscript on hats in literature and he wanted me to help him leverage its publication, despite the fact that he could not be bothered to cite me once. And still, he asked me to write the preface to his book. I said no. But several other scholars (my peers) stepped in to blurb the book. My favorite one praised his book on hats for "filling the void" on the subject matter of hats in literature.

Still, I believed in my book. It was original and different, the first book on hats in literature! I was confident people would find it out eventually, and, in the end, redeem it by mentioning it. I put a few excerpts online, and proceeded to take my scholarly interests in a new direction. It was out of my hands.

A few more years went by. Finally, the tide turned. My book started to make its way into other books and articles, sometimes in surprising, unexpected ways. Most mentions were painfully cursory, an afterthought, a professional formality. Several citations of my work made it clear that the authors had never read my book but only the excerpts I had put online. One clever peer wove together materials from my Web site with a review that was posted online and created a credible paragraph that distorted my original argument. In fact, one or two others came painfully close to attributing my "contribution" to an online reviewer who summarized my book. This is how my bid to use the Web to promote my scholarship backfired. In this age of Web research, even scholars would rather not order a book through interlibrary loan as long as they can pretend they have read it. Who was I to think that in this postmodern age, citations would be anything other than simulacra? But I digress.

There have been two substantial engagements with the contents of my book, mainly in footnotes. I was grateful and felt somewhat redeemed, but was this the best I could hope for? What did I want or need to feel like my work mattered? It’s embarrassing to answer this question but here goes: I needed someone to recognize my work in the body of their scholarship, explicitly, not via Web "CliffsNotes," or a cursory footnote. I did not need a page-long discussion of my work, that’s too much, but just something that would say my Little Dorrit of a book had existed and was deserving of being mentioned out in the open. Four sentences, out in the open, would do it, and I could settle for a footnote longer than one line. I could settle for a footnote containing a few lines on what I had labored over for so many years. I think that would do it for me. Really.

I’m luckier than most. My book is appearing on people’s radars. It may just be a blip, but it’s there. There are a few people interested in hats in literature, apparently. I’m not a total failure -- far from it. On the contrary, I got tenure on the shoulders of this book and recently some presses have asked me to blurb other books (like that book on baseball caps and the other one on the representation of heads in literature). It feels like a sham to be treated like someone important when my book is so marginal or superfluous. But that’s fine; I’m not going to turn down such publicity.

So, I’m an arrogant ass, or a narcissist. Let me steal the thunder of the readers of this piece. But someone needs to speak up for all the books that have been undeservedly shunted aside, maligned or marginalized. Someone needs to say what many published authors already know: Being an author is not all it’s cracked up to be. It can be a lot lonelier and more painful than you might expect. You pour your life into this thing, you parch yourself dry, and then all you can squeeze back into your sandy mouth is a few drops of moisture.

I’ve moved on. I’ve adjusted my expectations. I’ve done a reality check. My book pops up here and there and my name is out there. My articles are read and cited, sometimes repeatedly. That’s a lot more than what many of my peers have achieved. A little bit of gratitude is in order. I know.

What’s hard for me now is not the reception of my first book, but the motivation to write my second one. For years I’ve been publishing articles and editing books, but the time has come to buckle down and build the centerpiece of my case for full professor. I need to motivate myself to write another book that maybe will not make much of a difference, all over again. It’s hard to work up the gumption to do that. Another part of me, however, sees it differently. Writing a book, even an academic book destined to have very few readers, is no small feat of creation. My second monograph may or may not be important to others, but it will be written with passion and integrity. If I succeed in recovering the smallest part of that nine-year-old boy who could write, happily, for himself alone, I know that my second book will be something that I can be proud of, like my first book. There’s something zen and honest about that. Almost liberating. Now I just need to make it happen, one last time.

Author/s: 
Peter Dorchester

Peter Dorchester is the pen name of an associate professor in the humanities at a large university in the South.

Course Evaluations, Years Later

Just recently I got a set of teaching evaluations for a course that I taught in the fall of 2008 -- and another set for a course I taught in 2006.

This lag wasn't the fault of campus mail (it can be slow, but not that slow). Instead, the evaluations were part of a small experiment with long-delayed course assessments, surveys that ask students to reflect on the classes that they have taken a year or two or three earlier.

I've been considering such evaluations ever since I went through the tenure process a second time: the first was at a liberal arts college, the second two years later when I moved to a research university. Both institutions valued teaching but took markedly different approaches to student course evaluations. The research university relied almost exclusively on the summary scores of bubble-sheet course evaluations, while the liberal arts college didn't even allow candidates to include end-of-semester forms in tenure files. Instead, they contacted former students, including alumni, and asked them to write letters.

In my post-tenure debriefing at the liberal arts college, the provost shared excerpts from the letters. Some sounded similar to comments I would typically see in my end-of-semester course evaluations; others, especially those by alumni, resonated more deeply. They let me know what in my assignments and teaching had staying power.

But how to get that kind of longitudinal feedback at a big, public university?

My first try has been a brief online survey sent to a selection of my former students. Using SurveyMonkey, I cooked up a six-item questionnaire. I'm only mildly tech-savvy and this was my first time creating an online survey, but the software escorted me through the process quickly and easily. I finished in half an hour.

Using my university's online student administration system, I downloaded two course rosters -- one from a year ago, one from three years ago. I copied the e-mail address columns and pasted them into the survey. Eight clicks of the mouse later I was ready to send.

I sent the invitation to two sections of a small freshman honors English seminar I teach every other year. This course meets the first-year composition requirement and I teach it with a focus on the ways that writing can work as social action, both inside and outside the academy. During the first half of the semester students engage with a range of reading -- studies of literacy, theories of social change, articles from scholarly journals in composition studies, short stories and poems keyed to questions of social justice, essays from Harper's and The New York Times Magazine, papers written by my former students -- and they write four essays, all revised across drafts. During the latter part of the semester students work in teams on service-learning projects, first researching their local community partner organizations and then doing writing projects that I have worked out in advance of the semester with those organizations.

I taught the course pretty much the same in fall 2008 as I did in fall 2006, except that in 2008 I introduced a portfolio approach to assessment that deferred much of the final paper grading until the end of the course.

Through my online survey I wanted to know what stuck -- which readings (if any) continued to rattle around in their heads, whether all the drafting and revising we did proved relevant (or not) to their writing in other courses, and how the service experience shaped (or didn't) any future community engagement.

My small sample size -- only 28 (originally 30, but 2 students from the original rosters had left or graduated) -- certainly would not pass muster with the psychometricians. But the yield of 18 completed surveys, a response rate of over 60 percent, was encouraging.

I kept the survey short -- just six questions -- and promised students that it would take five to ten minutes of their winter break and that their identities would be kept anonymous.

The first item asked them to signal when they had taken the course, in 2006 or 2008. The next two were open-ended: "Have any particular readings, concepts, experiences, etc. from Honors English 1 stayed with you? If so, which ones? Are there any ways that the course shaped how you think and/or write? If so, how?" and "Given your classwork and experiences since taking Honors English 1, what do you wish would have been covered in that course but wasn't?" These were followed by two multiple-choice questions: one about their involvement in community outreach (I wanted to get a rough sense of whether the service-learning component of the course had or hadn't influenced future community engagement); and another that queried whether they would recommend the course to an incoming student. I concluded with an open invitation to comment.

As might be expected from a small, interactive honors seminar, most who responded had favorable memories of the course. But more interesting to me were the specifics: they singled out particular books, stories, and assignments. Several of those I was planning to keep in the course anyway; a few I had been considering replacing (each semester I fiddle with my reading list). The student comments rescued a few of them.

I also attended to what was not said. The readings and assignments that none of the 18 mentioned will be my prime candidates for cutting from the syllabus.

Without prompting, a few students from the 2008 section singled out the portfolio system as encouraging them to take risks in their writing, which affirms that approach. Students from both sections mentioned the value of the collaborative writing assignments (I'm always struggling with the proportion of individual versus collaborative assignments). Several surprised me by wishing that we had spent more time on prose style.

I also learned that while more than half of the respondents continued to be involved in some kind of community outreach (not a big surprise because they had self-selected a service-learning course), only one continued to work with the same community partner from the course. That suggested that I need to be more deliberate about encouraging such continuity.

In all, the responses didn't trigger a seismic shift in how I'll next teach the course, but they did help me revise with greater confidence and tinker with greater precision.

I am not suggesting that delayed online surveys should replace the traditional captive-audience, end-of-semester evaluations. Delayed surveys likely undercount students who are unmotivated or who had a bad experience in the course and miss entirely those who dropped or transferred out of the institution (and we need feedback from such students). Yet my small experiment suggests that time-tempered evaluations are worth the hour it takes to create and administer the survey.

Next January, another round, and this time with larger, non-honors courses.

Author/s: 
Tom Deans

Tom Deans is associate professor of English at the University of Connecticut.

Survival of the Disciplines

First they came for the religious studies scholars and the geologists, and I posted comments on a couple of blogs. Then they came for the film studies people and the comparative littérateurs, and I briefly considered joining a Facebook group in protest. Then they came for the paleographers and computational linguists, and I signed a petition. Hold on, let me see who's banging on the door at this hour...

As long as I've been paddling around in academia -- i.e., since my father got his master's degree -- tenure has been the flagpole on which academic freedom has flown. It was all about protecting individuals from the pressures that the status quo puts on forward-thinking research.

Underlying that approach is the assumption that all areas of study are important, although individual arguments and conclusions may not be. But the recent developments at the University of Florida, the University of Iowa, King's College London, Washington State University, USC, and a host of other institutions reflect a new model of limiting academic inquiry, one that sidesteps the protections of tenure altogether.

The script seems to be the same everywhere. Go after the whole discipline, making sure to pay unctuous lip service to its importance and excellence. Make the point that ITTET (In These Tough Economic Times), colleges now have to be selective about what fields they can (read: deign to) support. Throw gobbets of meat to the angry students. Dodge the faculty as much as possible, and when you can't, turn them against each other by insisting that some program will have to go, and who would they load into the tumbrels instead? Use the word “painful” in every sentence.

I have no issue per se with specialization; most institutions can't have a program in every possible discipline. But over and over again, we're seeing an emphasis on STEM fields -- science, technology, engineering, and mathematics. The more you read about the STEM initiative, the scarier it gets. "STEM is the indicator of a healthy society"; "STEM is the key to future success"; STEM is the only thing that will keep us from living in refrigerator boxes under the freeway and eating our young. And just look at the signatories: Those are the institutions that have committed to prioritizing the sciences over the humanities and the social sciences. Goodbye, liberal arts -- it’s been fun, but now it’s time to get serious.

You will already have noticed that S, T, E, and M are not just the fields that bring in the money but also the fields that prefer to assign as small a role to interpretation as possible. Of course scientific data require human interpretation, but all the STEM-mers I know believe that their fields deal in right and wrong answers. My colleague in the math department informs me that her discipline involves no interpretation whatsoever. And this is just as it should be; the natural world can refute hypotheses with tremendous clarity (see under: phlogiston; blood-letting; group selection).

But right and wrong answers occupy only one side of the academic quad. And this axing of whole fields closely resembles an attack on the humanities and social sciences -- in other words, the interpretive studies. It's not a concerted attack (complex conspiracies almost never succeed), but the effect is the same: promoting black-and-white disciplines and demoting unresolvable ambiguity to the realm of the hobbyist.

The effect on literary studies seems pretty obvious to me. Criticism will disappear quickly, and we'll return to the era of Appreciation. (Can you tell that I've just been teaching my theory students about 19th-century lit crit, to show them what the formalists were reacting to?) That's not a bad thing, except that aesthetic appreciation is generally (I'm inclined to say "necessarily," but I'm not sure I can defend that claim) a very effective means of shushing minority/subaltern groups and reinforcing the dominant ideology. The D.I. sets up opaque standards of appreciation and then measures everything by them -- and anything representing a different ideology (and standard of appreciation) is dismissed. That's exactly what happened to computational linguistics at King's College London.

I'm not sure where this leaves us, aside from up the creek. Perhaps subaltern studies is the last barricade against this broadscale attack on whole classes of disciplines. After all, the subaltern is mad as hell and not going to take it any more. So too might be the medievalists, the linguists, and the rural sociologists, but we don't know jack about organizing and making our voices heard.

The English Department could be the one to turn out the lights when we go. They keep us around because they value something they call "clear writing," and they think that whatever our silly little research is about, at least we teach writing (so they don't have to). Little do they know that we also teach the careful manipulation of metaphor -- better known as propaganda and marketing. But we're obviously not practicing what we teach, or else the interpretive disciplines would be in better shape.

The same can be said for Political Science, to choose just one example in the social sciences. A physicist friend points out with some bitterness that STEM has already come up with a set of solutions (her word, not mine) for global warming. Implementing them is the problem, she notes, and that is a job for the humanities and social sciences. If we gut those areas, every problem is left half-solved.

The thought of a world without Criticism -- a culture where any problem requiring interpretation is either ignored or recast as one with a single right answer -- isn't pretty. All those claims made for STEM fields (healthy society, future success, blah blah blah) are every bit as true for the interpretive studies. I agree that we could all do with more knowledge of S, T, E, & M (I pressure all my advisees to take statistics, for a start), but a society that sees every question in terms of black and white isn't going far. At least not in an upward direction.

Author/s: 
Meg Worley

Meg Worley is an assistant professor of English at Pomona College.

Put Out to Pasture

I want to believe that when I was taking my favorite professors’ classes, those great men and women were at their peak.

I was a little disconcerted, then, when an older friend recently told me about how good my hero and mentor, the critic Marvin Mudrick, had been 20 years before I had taken him. “But … but I was there at the end,” I whined to myself. For eight years (until he died in 1986), as an undergraduate and graduate student at the University of California at Santa Barbara, I took his classes or sat in on them.

Even so, I knew there were quarters and classes during that time in which he was better than in others. But I wanted him to have been at his best during those years and I guess he fed into that conceit himself. He would make fun of some of his own old views about books, writers, and teaching — so I believed I was taking him at his peak. He seemed to think he was at his peak.

This past week, with my 11-year-old daughter sitting in on a few of my classes during her school break, I was perhaps at my worst. She was looking at me with expectation, attentively — an encouraging, demanding student. I watched my language and I hoped the students would watch theirs, not that she hasn’t heard everything. And then the next day she was supposed to come again to my classes, but she stayed back to hang out with another professor’s daughter at a campus closer to home, and I was free, and I was in the classroom, happy to be free, aware, by this point in the semester, how far I could push the students and hoping, in a couple of them, to keep them engaged. I was funnier than I had been in a long while — telling tangential stories that then led into better conversations than we would’ve got.

“We tell you everything — what about you?” teased one student.

And that day I was not old Bob — that is, paternal, avuncular Bob — I was young Bob, the one I’ve been missing, and I was willing to tell them things about my life that I wouldn’t have told them if my daughter had been in the room.

I was younger without my daughter than with her — I was free again, and teaching like that, by my wits rather than by my deliberate, this-is-for-your-own-good friendliness and deadpan, I was better for a day.

I was better.

I’ve told a few young teachers and a few young adjunct professors that I was happier teaching as an adjunct than as a full-time professor. It’s something like the difference I felt as a student writing for one professor over another, or occasionally feel now, writing for one publication instead of another. In one I’m loose, myself, giddy, and in the other I’m responsible and sober. I’m better unsober. I don’t drink alcohol, but I’m better and smarter when I’m funny, when my funniness loosens up the class and makes my developmental students, so very self-conscious, so very cautious, lean out a little for a look, go out on a limb.

Don’t things change for us as teachers? Don’t we have to deal with that damned aging in a way that our friends in non-teaching professions don’t? We get older, but the students stay the same age. Mr. Mudrick used to tell us, his students, that he talked about love and sex a lot in the classroom because it was the only thing we all shared an interest in. I’m not so daring or funny as he was, so I don’t go very far that way, except … sometimes.

Every semester I can still get the recent immigrants and 18-year-olds hot under the collar about William Carlos Williams’s “The Knife of the Times,” a story he wrote in the early 1930s about a potential affair between two women, who were girlhood friends, and are now middle-aged and married with families. Some of my students unashamedly express prejudices about homosexuality, but outrage as well about such affairs, and yet fewer than one in five of the parental marriages in the room are intact.

Fictional characters are somehow supposed to behave! Better than real people! Most of my non-literary students hate conflicted people, people struggling to make romantic decisions that will cripple them. Political decisions, social decisions, those are too easy, in my opinion. But dare to tell someone you love that you love her? In my experience that’s the biggest drama. Call me a Jane Austenite. But also call me old.

Aging athletes, like aging professors, also like to say that they’re better now than they used to be. But people who really pay attention know that sometimes the young superstar is best when young; that he doesn’t just get better and better as some artists do; he hits his physical peak, and, lacking steroids — are there steroids for artists or professors? — he deteriorates and becomes a coach.

As I’ve proceeded as a teacher of developmental English I’ve become, to my thinking, more like a coach, an encourager, a butt-slapper (but because we’re not on a field, I do so only metaphorically). But I was better, I think, as a young professor, someone in-between, as someone questioning what we as a classroom of friendly strangers are doing, as the guide who occasionally stops and wonders out loud, “Where are we going and why?”

No, I’m older and so aware of time passing that I get as anxious as a sheepdog and herd them along.

Mr. Mudrick may have stayed younger by making himself more and more aware of the constraints on him as provost of a large college program and professor. That is, he deliberately wouldn’t let himself hold back. Because he was my unorthodox model, perhaps it’s inevitable that I slide toward more conventionality rather than further away from it. And perhaps this has come to mind because last week I started listening again to old tape-recordings of his classes.

He was so much himself in those classes, so happy to be there, so interested in us, in our reactions to what we read, in our reactions to what he provokingly and amusingly said, that those hundreds of hours in his classes continue to make me happy. But having had those stirring experiences, on teaching days when my students and I have slogged through something that my college or department or my own old-age practicality has decided is necessary, I despair!

And then I have a good day, and I’m reminded of my old self, and I know I’ve lost something.

But like an athlete on the long decline, I stick around because I really still do like the game. I grimace when I miss a good teaching moment! Like a batter missing a fat pitch, I wince, “Oh, I should’ve nailed that!” A while ago, back in the day, I would’ve! So I’m slower, more watchful and deliberate, and because I can’t afford to miss as much as I used to, I’ve become more likely to take advantage of the little things that come up and go my way.

Author/s: 
Bob Blaisdell

Bob Blaisdell is a professor of English at City University of New York’s Kingsborough Community College.

Going Nodal

A year ago in October, on a Saturday morning when the sun would not show its face, a group of about 30 faculty members sat around tables in a classroom that looks out on a restored prairie. The view from this window was already interdisciplinary; this piece of land not only serves as a site for scientific research, but is also presided over by the austere profile of a limestone cairn designed by British artist Andy Goldsworthy.

Helped by a grant from the Howard Hughes Medical Institute, we came together to talk about nodes. It’s not often that language in a grant proposal captures the imagination of a campus, but this has happened with the idea of nodes. Several of our faculty members in the sciences — led by a chemist, Mark Levandoski — came up with the idea. A node is a term used in more than one field: words like boundary, equilibrium, scale, transfer, model, energy, preservation. To learn what it means in other contexts might enhance the ability to understand and explain the concept in one’s own discipline. Hence the quest to identify such concepts — or nodes — in our undergraduate curriculum, and to discover how we can teach them more effectively.

The aim is not to develop a list of must-have concepts in the sciences. Some years ago our curriculum shifted to a focus on investigative skills and processes that largely replaced coverage of specified content. Instead of making a list, we want to discover where these intersections are occurring, and capitalize on them to help students learn. In the first phase of the grant, we took advantage of freed-up time to enroll in each other’s classes, the better to learn what students are hearing from our colleagues. Ultimately, the plan is to draw attention to the nodes and be clear with students about complementary perspectives across disciplines. As a result of examining nodes, interdisciplinarity — the relationships between disciplines and how each constructs knowledge — would become part of what we teach, even at the introductory level.

At the Saturday retreat on the prairie, some of the initial goals were already shifting. For one thing, there was no way we could limit this idea to the sciences. At least one economist, a philosopher, and a librarian had been invited, and some of the liveliest discussion arose at their tables. At a college where every faculty member teaches the required first-year tutorial, in a campus climate that invites exploration of new technologies and proposals for team-taught seminars, we share the territory.

A biologist declared, startling the economist and physicists, “To us, equilibrium is death!” Another biologist became restless as the librarian at her table extolled the node of preservation. She thought of dusty books, and wondered what she, a molecular geneticist, could do with this node. Suddenly it came to her. Fundamental to her work is the paradox that the material of biological inheritance must resist change in order to preserve hereditary information, while also being open to change in response to new environmental and evolutionary challenges. “I can’t help but think,” she reported after the session, “that a longer, deeper discussion with a group of non-biologists about preservation would freshen the way that I think about this idea and the way that I teach it. It turns out that this concept comes up in every biology course I teach.”

The models node has already provided a basis for early efforts at coordination between our intermediate-level biology and chemistry courses. After taking a summer workshop supported by the HHMI grant, chemistry professor Steven Sieck and biology professor Shannon Hinsa-Leasure developed a plan to present students with the models of penicillin used in their two fields. On the first day of class, Steve led his students through an outline of this molecule’s synthesis, which includes about 20 different chemical reactions. Toward the end of the course, Steve again presented the same synthesis, highlighting the fact that most of these reactions had been covered in the course. Meanwhile, students co-enrolled in Shannon’s Biology 251 studied the mechanism of action for this same molecule — how the drug inhibits the ability of bacteria to synthesize cell walls. And in both classes, students were encouraged to go and see penicillin represented in works of art featured in "Molecules That Matter" on exhibit at the college’s Faulconer Gallery.

As a dean trained in literature and writing, I recognize that nodes have been around for a long time, and that another word for them is metaphors. An influential book by George Lakoff and Mark Johnson, Metaphors We Live By (1980), asserts that all conceptual thinking relies on metaphor.

In the spring, invited to lunch with a visiting group of statisticians, I performed a small test. I asked them what they thought about the word ambiguity. They recoiled. Ambiguity is bad. It confounds data and must be expunged from survey questions. What about in my field? I let them in on the fact that literary critics find ambiguity fascinating. How else could we examine the same novel or poem for centuries, without agreeing on — or even wanting — a final, definitive account of its meaning? They began talking among themselves again, about ambiguity. Maybe it was a richer concept in their field, too, than they had realized. I sat back, relieved. I had wondered what I could talk about for a whole lunch meeting, alone in a room of statisticians, a dean from the English department who had never taken a statistics class. But there would be more than enough to fill the hour. We had just begun to explore a node.

Paula V. Smith is dean of the college and vice president for academic affairs at Grinnell College.

Pride in One's Work

Throughout my 31 years in higher education, from assistant to full professor at three universities — Oklahoma State, Ohio and Iowa State — I cannot recall doing anything that produced a lingering feeling of pride.

I’m not talking about ego-related pride in a promotion or an award; you outgrow those as years pass. No. I’m talking about an act so challenging that you doubted that you could perform it but undertook it anyway as a test of character or acumen.

As an ethicist, I know that pride is a deadly sin — the deadliest of the seven, in fact, the "sin of sins" — responsible for the fall of Lucifer from heaven (and many an assistant professor from the Ivory Tower).

The pride of which I speak has certain characteristics. It is done for internal rather than external reasons, often as a barometer of validity, and requires:

  • A test of one’s talents, knowledge, research or skill beyond what is routinely achievable.
  • The witnessing of that test by others so that the specter of public failure exists.
  • Courage to go through with the test in spite of feelings of dread or potential embarrassment.

To be honest, I never have taken much pride in my work as a teacher, researcher and administrator. That’s not a boast; it’s a treadmill fact. I dislike networking socially with former students because current ones need my time and attention. By the time my research is published, I’m doing other experiments that may end up refuting former hypotheses.

I'm sharing my sense of pride today not to celebrate myself but to remind you that renewal is essential with the academy in recession. The institution will take from you without acknowledgment or reward. Over time, that may cause you to question or doubt your validity and worth.

In fairness, though, educators who test students semester after semester may neglect to test themselves on the very principles and practices they embrace in the classroom or conference room. Theory is one thing; applying it in real life, another.

I purposefully did not mention my specific challenge because I didn’t want you to dismiss my experience as merely journalistic. You can conjure challenges in any discipline, and their range will vary from person to person and pedagogy to pedagogy. Nor should you do anything risky that could harm you or your career and then sue because Inside Higher Ed incited you with this article. You’re an adult. Do what you will within reason and accept the responsibility and consequences — that’s part of the challenge, anyway.

Here was mine: my colleague Dennis Chamberlin, a Pulitzer Prize-winning journalist, and I left our posts as journalism educators and worked for a week as writer and photographer for The Des Moines Register, to see whether Watergate-era reporters could succeed in the digital newsroom after a decades-long hiatus.

Before we began, we secured official sponsors for our blog, including the social network NewsTrust.net, the Washington Post Writers Group and the Association for Education in Journalism and Mass Communication. Our plan was to post daily for one week before, during and after our Register gig.

Because we grade students, we asked the managing editor, Randy Brubaker, to grade us. Did I mention that our 750 undergraduate students, 35 teachers and staff members, and untold alumni and donors were following our blog via a shared link and RSS feed on our school’s home page?

You might wonder what prompted two educators, secure in their careers, to take that risk before such an audience, knowing the political impact could be huge, especially on the Internet. For instance, I have written widely and skeptically about consumer technology, yet I would be tweeting and blogging throughout my Register experience. Moreover, in addition to our constituents at Iowa State, other blogs would be following us as we tested the unpopular hypothesis that education and industry put too high a value on new technology and too low a premium on principles.

Chamberlin and I worked in media during a highly technological period — the switch from typewriters to computers — and so understood how complicated computing was in the DOS age of the 1970s and ’80s. Today’s technology, we felt, does everything for and about you, announcing upcoming appointments, providing driving directions to interviews, and taking dictation or photos on demand and on site.

You can read about our experiment at “My Register Experience,” which begins with comparisons of technology then and now. By the end of our journey, we were among the first in the media to report on “The New Poverty,” about middle-class people who had paid their bills and taxes and suddenly found themselves at homeless shelters or in need of food and medical care, in a state known for both. We also wrote and shot in narrative style — with a beginning, middle and end (rare in today’s reportage) — and interviewed on the street rather than in the suite, finding the unemployed at a public lake rather than at the unemployment office, based on the notion that Iowans, with their strong work ethic, needed something to do in the morning.

We also used intuition more than Global Positioning System software to track the depth of the recession. In doing so, we broke a big story, along with one journalism principle. Bob Woodward and Carl Bernstein were known for using anonymous sources. We did, too, knowing the practice had eventually become taboo (for purists, anyway) because it undermined credibility.

At that lake, we encountered an unemployed nurse who at first identified herself and later asked for anonymity. Her personal plight was dramatic, and her reason for not being named entirely legitimate — being identified might hurt her job search. So we honored her request.

Shortly after our story appeared in the Nov. 23, 2009, edition of the Register, Buffy Renee Lucas, an unemployed nurse with similar demographics, drove her SUV into the lake at the same shore where Chamberlin and I had conducted interviews, killing herself. Of course it was assumed that she was the very nurse we had interviewed for our story, but she wasn’t. We used the blog to clarify that after publication, grateful for the immediacy of online publication.

We tweeted whenever we had an update to our story, and that integrated well with our blog, driving readers to the blog and, ultimately, from the blog to the print product on its run day.

In the end we received a passing grade from the managing editor.

Much good came out of our report, with food banks replenished and even television network follow-up, culminating in free psychiatric workshops on the untreated effects of persistent unemployment.

I returned to my journalism school with new appreciation for the work that modern-day reporters do in the digital newsroom, producing content on demand. Also, our methods inspired younger journalists who wanted to practice street reporting. We learned from them the value of digital devices in meeting deadlines that now arrive with every login.

Months later, something deeper than pride occurred within me: validation. As a reporter and bureau manager, I had witnessed firsthand the trauma and sorrow of spot news working for United Press International. I covered serial killings, prison riots, natural disasters and uprisings on Native American reservations. Because of that, I left the newsroom for the classroom. Returning to the newsroom involved courage more than the specter of public failure; it required inner strength to silence the demons of hard news past.

For that, I remain indebted to the Register for trusting Chamberlin and me to work a week without preparation and to file a human-interest story that resulted in some good and disclosed some bad, including escalating suicide rates that correlated in part with the recession.

I look back at recent awards, promotions and even published scholarship with little sense of pride. I am paid to do that. But not this, which was a statement — or maybe a punctuation point — in my career, knowing that my principles still had value and that I, as an educator, was genuine in conveying them to students and sharing them with colleagues, as I am doing now.

In closing, I encourage you to share in the comments section below, or even in a submission to Inside Higher Ed, how you may have tested your own talents, knowledge, research or skill beyond what you knew you could achieve, summoning courage in the face of dread or embarrassment.

We need to hear courageous stories that inspire others in this lingering recession, as we face budget cuts and larger workloads, or even furloughs, firings and program elimination. If ever there was a need for uplifting stories, it is now: stories that remind us why we dedicated our lives to higher education and why we take pride in our work, whether or not others appreciate or even acknowledge it.

Michael Bugeja, director of the Greenlee School of Journalism and Communication at Iowa State University, is author of Living Ethics Across Media Platforms and Interpersonal Divide: The Search for Community in a Technological Age.

New Digital Tools

Novelty is not, as such, a value to me. One look at my wardrobe will confirm this. But when it comes to assessing new digital tools, being resolutely un-with-it may have certain advantages. I am slow to enthusiasm, and keep well away from the cutting edge, for fear of falling off. All that really counts, to my taste, is usefulness – though simplicity has a definite appeal.

With this week’s column, I want to recommend two such tools. They are free and easy to use. And without indulging in tech boosterism, it seems fair to say that they will improve the time you spend online.

Elegance and efficiency are the defining qualities of Readability. The very name is a case in point – it tells you exactly what you are getting.

With the press of a button, Readability transforms a page from any Web site – however cluttered or eyestrain-inducing – into something clean and legible. It also puts the text in large print. Although I am sufficiently far-sighted to need reading glasses, I don’t need them when using Readability.

But even a person with 20-20 vision in each eye might find Readability appealing for its aesthetic impact. It wipes out all the distractions (sidebars, ads, comments, and most graphic elements) and leaves you with pure, unadorned text.

Someone with no technological aptitude can install Readability in about five seconds. The learning curve for its use takes not much longer than that. It works in the major browsers: Internet Explorer, Firefox, and Safari. Once installed, it will create either a button in your browser’s toolbar or an “R” icon in the browser’s lower right-hand corner (what people with the lingo call its “system tray”).

When you find a Web page that you’d care to read as unadorned text, click on the Readability button. It promptly transforms the article (or blog post, or what have you) into a document that resembles a typescript in roughly 14- or 16-point characters. Graphics and photos embedded in the article will remain, but everything else is stripped out.

To return to the original version of the article, either hit the browser’s “back” button or click the faint back arrow that floats in the upper left corner of the Readability screen. Another such button allows you to print the page as it appears in Readability.

Doing so has its advantages, ecological as well as optical. Printing the graphic elements on a Web page can waste a lot of toner.

It bears mentioning that Readability is an option and not a default setting. In other words, if you are looking at something in it, then go to another page, the new page will not automatically open in Readability. Not a big deal, of course. (You just click it back on.)

Unfortunately the Readability plug-in does nothing with a document in PDF. Also, it will sometimes remove the name of the author from an article -- depending, presumably, on whether it is incorporated into the text or not.

That is a pain. I’m not going to complain too much, though. Readability has already saved me plenty of eyestrain. More than a gizmo, it’s become something I’d hate to be without.

***

A little more time and experimentation are required to master Evernote, but it’s worth the effort. It is an impressive and appealing tool, almost certain to help you get the most out of time spent doing research online.

As with Readability, I learned of it from my wife, who is a research librarian specializing in information technology. A few months ago, she began proselytizing for it with all the fervor of a Jehovah’s Witness in possession of the latest issue of The Watchtower.

Its virtues and capacities were, so one gathered, both various and mind-boggling, though this inspired in me no undue haste to convert. (I am, remember, a man wearing t-shirts manufactured before many of today’s undergraduates were born.) But having come to recognize the sheer power of Evernote, I am now prepared to spread the good word.

It is something like a hybrid between a notebook and a filing cabinet. That’s the closest analogy that comes to mind. But it understates things quite a bit.

At its most basic level, the application allows you to take notes and organize them into files. You can attach labels to the resulting documents, and search them. But that is really just the tip of the iceberg. Evernote will also allow you to collect and store copies of web pages and articles you’ve found online, as well as PDFs, photographs, scanned documents, and audio files. You are able to add notes to those multimedia files, too, and to attach tags that will help you find them again.

An example: I am gathering ideas and references for a lecture on Bolshevik cultural policy. For the most part, this involves rereading things, but I notice almost by chance that someone is selling a portrait of Lunacharsky, the first Soviet commissar of arts and education, on eBay. A bit too expensive for my budget, alas. But thanks to Evernote I can grab the image and store it in the working file alongside quotations from his work. And I can attach a tag that will remind me to use it as one of the slides for my talk.

Evernote allows you to share any given file with other people – by making it available to invited guests or (through a URL) the whole world. And it has at least one feature that is like something out of a spy movie: via its optical character recognition feature, you can take a photograph of text and then use Evernote to search for the words in the photo.

While having dinner at a Chinese restaurant with my technology guru, I sat dumbfounded as she took a snapshot of the menu with her BlackBerry... loaded it into Evernote... searched for the word “dumpling,” which Evernote highlighted in yellow... then forwarded the resulting phototext by email.

You can use Evernote with your desktop computer, laptop, netbook, or cell phone. Or all of the above, really, depending on what is most convenient at any given time – for you can have your files stored at Evernote.com. They are in “the cloud,” to use an expression proving that we now dwell in a science-fiction landscape.

The free version of Evernote is available for download from the Evernote Web site. There is also a premium version, costing $50 per year, that I have not used. Among other things, it gives you more room for your files and allows you to save documents in other formats, including Word. (The free version provides generous but not unlimited storage capacity.)

Evernote has some similarities to Zotero, though it gives you control over a wider variety of materials. On the other hand, Zotero is designed for scholarly use and has the capacity to locate and “grab” bibliographical data from library catalog records, while Evernote does not. (You can store such information using Evernote, of course, but Zotero is more efficient about it and knows how to export the data in various standard citation formats.) Each is a valuable research tool, and with time I will probably figure out a way to move between them.

The Web site for Evernote will give you some idea how to use it, and you can figure a lot out with a period of trial and error. But it might be worthwhile to seek out a little training. Your best bet might be to ask for help at your library, which is staffed by information-science wizards with amazing powers.

Author: Scott McLemee (scott.mclemee@insidehighered.com)
