
Return to Earth College

I’m not much one for reunions at my alma mater. But I did have a 25th reunion last month at one of my journalistic alma maters, so to speak, College of the Atlantic, the small, environmentally oriented, alternative liberal arts college located off the coast of Maine. It was one of the colleges I covered during my first tour of duty as a freelance education writer during the late 1970s and early 1980s.

Like most of the stories I did during my early, gallivanting days, the one I did about COA began with a hunch. The little information I had about this remote, decade-old, solar-powered cousin of Bennington, Goddard, et al., was that COA offered a bachelor of arts degree in something called human ecology, and that staff and students spent a lot of time observing and tracking whales. I was intrigued. 

And so, armed with an assignment, off I flew to Bar Harbor, Maine, for what turned out to be one of my most memorable assignments covering academe. I was immediately taken with the college’s Noah-like president, Ed Kaelber, and his vice president, Sam Eliot, whose environmentalist passion was leavened by a self-deprecatory sense of humor.

What moved COA’s founders to establish their college-cum-environmentalist colony back in ’69? I asked Eliot one blustery evening, as we huddled over coffee in his office in the college’s Ark-like wooden administration building. “Basically, we came out here to save the world,” Eliot said. “Now,” he said with a grin, “we’re concentrating on Maine.”

And saving Maine the earnest eco-missionaries of COA were, via such inspired stratagems as converting a dead minke whale that had washed up near the college into a mobile mammalian biology diorama for the benefit of the local populace. Whale on Wheels, it was called. COA students were largely responsible for preserving Maine’s Great Heath, an ecologically unique bog. The college’s Harbor Seal Project had helped rescue many abandoned or stranded seals. And the Department of the Interior thought highly enough of the biologist Steve Katona’s course, Whales of the North Atlantic, to award his class a contract for the Mount Desert Island Whale Watch. With 180 students and 15 faculty members, classes at the spare, island-based campus were small, education an intense, hands-on affair. I never saw a faculty as inspired and committed as COA’s.

For the most part, classes at COA were as intellectually rigorous as anywhere, if not more so. Some people might have difficulty defining exactly what human ecology meant -- "it's … a seagull," said one misty-eyed student -- and yet COA students were making real connections between man and nature. Here, in December 1980, as the new materialistic morning of Ronald Reagan was dawning, was a college really dedicated to changing and, yes, saving the world.

To a sixties survivor that was bracing to behold. "If the deterioration of the environment keeps going the way it is now," in the prescient words of Glen Berkowitz, one of the many dynamic, clear-eyed students I met during my fascinating sojourn in Bar Harbor, "people will have to use COA graduates." He was right. (In fact, Berkowitz, who graduated in 1982, went on to become a senior consultant with Boston’s massive Big Dig project, where he advised the builders on the human impact of the dig, and is now involved with a wind power project for the city’s harbor.) He's but one of the many COA graduates who have used their unique education to do social and environmental good. Others include Chellie Pingree, head of Common Cause, and Bill McLellan, a University of North Carolina research scientist whom National Public Radio recently described as the federal government’s “go-to guy on marine mammal research.”

I had planned on a visit of several days. Instead I wound up staying for several weeks. My subsequent dispatch about “Earth College,” as I good-naturedly dubbed the place, reflected my affection for the spunky laboratory school. "To be sure, the college needs a gymnasium and a student center," I reported. "But the College of the Atlantic is alive and well. That in itself is something to celebrate."

Privately, I wasn’t so optimistic. The future for alternative or experimental colleges, I well knew, was increasingly grim, having recently reported the demise of one of COA’s experimental siblings, Eisenhower College, whose lofty-minded World Studies program and holistic educational philosophy were not unlike COA’s.

Hence my delight and surprise, upon recently visiting the college on the Web, to encounter an institution that, at least on the evidence of its kaleidoscopic site, was thriving.  But Web sites can be deceiving. It was time to check out College of the Atlantic again.  

And so, last month, just as I had a quarter of a century before, I set off for the college’s rustic, coastal Maine campus, next to Acadia National Park. Once again I found myself auditing classes, hanging out with COA students and faculty in the main dining room, listening to the swooning sea gulls, just as I did long ago.

My green reunion. Best reunion I ever had.

To be sure, I learned from some of the veteran COA faculty I met up with again, COA did wind up having its own Sturm und Drang period in the early 80s, including a civil war pitting faculty and staff who wished to keep the college as a college against another faction that wanted COA to become more of a think tank. The former won. However, enrollment at the beleaguered campus dropped to a mere hundred. "We almost lost the college," one teacher said.

Nevertheless, under the leadership of Steve Katona, the college’s savvy whale-watcher-turned-president, who has been at the college’s helm since 1992, COA has survived. Now, with an enrollment of 270 students -- over 20 percent of them from abroad -- and 26 faculty members, COA is, indeed, thriving. Shedding the "experimental" label that once put off parents of prospective students, the pioneering institution is competitive with some of the best mainstream liberal arts colleges in the country, while the human ecology concept and educational philosophy that COA pioneered has gained respect.

On the surface, COA is no longer as "crazy" as it once was. The college has an eye-catching logo now, and an expensive viewbook. The food is no longer strictly vegetarian. COA’s ponytail is gone.

And yet, I could see, in the small, intensely participatory classes and laboratories I audited, and in the interactions I had with students and faculty, that the college’s essence and mission are unchanged. Here, still, on this remote island off the coast of Maine, is a community unabashedly committed to saving the world.

One professor, Davis Taylor, is an economist and former Army captain who attended West Point. He said that while at first blush one could hardly think of two institutions more different than West Point and COA, he saw similarities between the two. "Both have a sense of mission," Taylor said, and “both emphasize systems thinking.”

As one student after another, including ones from as far away as Serbia and Seattle, told me, “I came here to make a difference.”

And they meant it in the best sense, as I could see during the rainy but otherwise mind-and-spirit-expanding week I spent in Bar Harbor. It was clear in a horizon-busting class in environmental history and in an impromptu world music session in the college greenhouse. College of the Atlantic is still alive and crazy after all these years. And, for one of its early champions, one who believes that the greatness of the American higher education system lies in its multiplicity, that was reassuring to see.

I could also see that original spirit in a hands-on, feet-in conference on riverine planning that I (literally) waded into, where COA faculty, staff and local planners joined forces to show journalists how it’s possible to shape community planning on an environmental and inter-county level.

So there I was one stormy afternoon hanging out with Bill Carpenter, the novelist and poet who has taught at COA since its founding 36 years ago, sifting the college's saga over strong coffee in his cozy, book-lined office. We had returned from an exciting, syncopated session of “Turn of the Century,” an interdisciplinary class in cultural history that Carpenter teaches along with the artist JoAnne Carpenter and the biologist John Anderson, in which the three professors enthusiastically riff off each other, in between questions from the packed, palpably delighted class of 25 (which for COA is huge).

“So, what was your original vision?”  I asked Carpenter, as we reminisced about the college’s wild and woolly early days.

“This was our vision,” he said, with finality.   

Here’s to survivors.

Author's email: 
info@insidehighered.com

Gordon F. Sander, an Ithaca-based journalist and historian, has written about higher education for The Times Higher Education Supplement, The Chronicle of Higher Education, The New York Times and many other publications. He was recently artist-in-residence at Cornell University's Risley College for the Creative and Performing Arts. His most recent book is The Frank Family That Survived: A 20th Century Odyssey (Random House UK).

Real Knowledge

During the heyday of American economic and geographical expansion, in the late 19th century, the men who sold real estate occupied a distinct vocational niche. They were slightly less respectable than, say, riverboat gamblers -- but undoubtedly more so than pirates on the open seas. It was a good job for someone who didn’t mind leaving town quickly.

But about 100 years ago, something important began to happen, as Jeffrey M. Hornstein recounts in A Nation of Realtors: A Cultural History of the Twentieth-Century American Middle Class, published this spring by Duke University Press. Some of those engaged in the trade started to understand themselves as professionals.

They created local realty boards and introduced licensing as means by which reputable practitioners could distinguish themselves from grifters. And in time, they were well enough organized to lobby the federal government on housing policy -- favoring developments that encouraged the building of single-family units, rather than public housing. Their efforts, as Hornstein writes, "would effectively create a broad new white middle class haven in the suburbs, while leaving behind the upper class and the poor in cities increasingly polarized by race and wealth."

I picked up A Nation of Realtors expecting a mixture of social history and Glengarry Glen Ross. It's actually something different: a contribution to understanding how certain aspects of middle-class identity took shape -- both among the men (and later, increasingly, women) who identified themselves as Realtors and among their customers. Particularly interesting is the chapter "Applied Realology," which recounts the early efforts of a handful of academics to create a field of study that would then (in turn) bolster the profession’s claims to legitimacy and rigor.

Hornstein recently answered a series of questions about his book -- a brief shift of his attention back to scholarly concerns, since he is now organizing director of Service Employees International Union, Local 36, in Philadelphia.

Q:Before getting to your book, let me ask about your move from historical research to union organizing. What's the story behind that?

A: I was applying to graduate school in my senior year of college and my advisor told me that while he was sure I could handle grad school, he saw me as more of "a politician than a political scientist." I had always been involved in organizing people and was a campus leader. But I also enjoyed academic work, and went on to get two graduate degrees, one in political science from Penn, another in history from the University of Maryland.

While I was doing the history Ph.D. at Maryland, a group of teaching assistants got together and realized that we were an exploited group that could benefit from a union. Helping to form an organizing committee, affiliating with a national union, getting to know hard-boiled organizers (many of whom were also intellectuals), and attempting to persuade my peers that they needed to take control of their own working conditions through collective action captured my imagination and interest much more than research, writing, or teaching.  

After a long intellectual and personal journey, I finally defended my dissertation. The academic job market looked bleak, particularly as a graduate of a non-elite institution. And when I was honest with myself, I realized that my experience forming a graduate employee union engaged me far more than the intellectual work.

Armed with this insight, I put the diss in a box, and two weeks later, I was at the AFL-CIO’s Organizing Institute getting my first taste of what it would be like to organize workers as a vocation. In the dark barroom in the basement of the George Meany Center for Labor Studies, a recruiter from an SEIU local in Ohio approached me and asked me if I’d like to spend the next few years of my life living in Red Roof Inns, trying to help low-wage workers improve their lives. Two weeks later, I landed in Columbus, Ohio, and was soon hooked.

And I would add this: The supply of talented and committed organizers is far outstripped by the demand. The labor movement’s current crisis is, frankly, a huge opportunity for energetic and idealistic people to make a real difference. Hard work and commitment are really rewarded in the labor movement, and one can move quickly into positions of responsibility. It’s very demanding and often frustrating work, but it’s about as fulfilling a vocation as I could imagine.

Q:You discuss the emergence of realtors as the rise of a new kind of social identity, "the business professional." But I'm left wondering about early local real-estate boards. They sound kind of like lodges or fraternal groups, as much as anything else. In what sense are they comparable to today's professional organizations, as opposed to, say, the Elks or the Jaycees?

A: Indeed, early boards were very much like fraternal organizations. They were all male and clubby, there was often a "board home" that offered a retreat space, and so on. Early real estate board newsletters are rife with the sorts of jokes about women and minorities that were standard fare in the 1910s and 1920s -- jokes that, I argue, help to police the boundaries of masculinity.  

In the early chapters of the book, I provide brief sketches of the workings of the Chicago and Philadelphia real estate boards, as well as a sort of anthropological view of early real estate conventions. My favorite was the 1915 Los Angeles convention, during which the main social event was a drag party. In my view, the conventions, the board meetings, the social events, the publications, all formed a homosocial space in which a particular sort of masculinity was performed, where the conventions of middle-class masculinity were established and reinforced.  

In the early 1920s, the emphasis began to shift from fraternalism to a more technocratic, professional modality. Herbert Nelson took the helm at the National Association of Real Estate Boards in 1923, and he started to make NAREB look much more like a modern professional organization. In some respects he created the mold. He made long-term strategic plans, asserted the necessity for a permanent Realtor presence in Washington, D.C., pushed for standards for licensing, worked with Herbert Hoover’s Commerce Department to promulgate a standard zoning act, and linked up with Professor Richard T. Ely [of the University of Wisconsin at Madison] to help "scientize" the field.

Nelson served as executive director of NAREB for over 30 years. During his tenure, the organization grew, differentiated, specialized, and became a powerful national political actor. In sum, it became a true modern professional association in most ways. Yet like most other professional organizations prior to the ascendancy of feminism and the major incursion of women into the professions, masculine clubbiness remained an important element in the organizational culture well into the 1970s.    

In sum, the story I tell about the complex interdependencies of class, gender, and work identities is largely about the Realtors’ attempts to transform an Elks-like organization into a modern, "professional" business association.

Q:On the one hand, they see what they are doing as a kind of applied social science -- also creating, as you put it, "a professional metanarrative." On the other hand, you note that Ely's Institute for Research in Land Economics was a casualty of the end of the real estate bubble. Doesn't that justify some cynicism about realtors' quest for academic legitimacy?

A: I don’t see the Realtors or the social scientists like Ely in cynical terms at all. In fact, both parties are quite earnest about what they’re doing, in my view. Ely was nothing if not a true believer in the socially transformative power of his research and of social scientific research in general. He managed to persuade a faction of influential Realtors, primarily large-scale developers ("community-builders") such as J.C. Nichols, that research was the key to professionalism, prosperity, and high-quality real estate development.

Ely’s Institute was not a casualty of the implosion of the 1926 Florida real estate bubble as such. But the real estate collapse and the ensuing Depression made it much harder for the Realtors to make claims to authority based on disinterested science.

It’s not that the grounding of the whole field of Land Economics was problematic -- at least no more so than any other field of social or human science, particularly one that produces knowledge that can be used for commercial purposes.

The academic field was in its infancy in the 1910s and 1920s, and there were intra-disciplinary squabbles between the older, more historical economists like Ely and the younger generation, which was much more model- and mathematics-driven. At the same time, there were sharp divisions among Realtors between those who believed that professionalism required science (and licensing, and zoning, and so on) and those who rejected this idea.  

So, yes, the Elyian attempt at organizing the real estate industry on a purely ‘scientific’ basis, operating primarily in the interest of the social good, was largely a failure. However, the 1920s mark a watershed in that the National Association became a major producer and consumer of social scientific knowledge. Business schools began to offer real estate as a course of study. Textbooks, replete with charts and graphs and economic equations, proliferated. Prominent academics threw their lot in with the Realtors.

In the end, the industry established its own think tank, the Urban Land Institute, the motto of which is “Under All, The Land” -- taken straight from Ely’s work. But the profession itself remained divided over the value of ‘science’ -- the community-builders generally supported efforts to scientize the field, while those on the more speculative end of the profession were generally opposed.

But again, I don’t think that the grounding of the field of land economics is any more questionable than any other subfield of economics, such as finance or accounting.

Q:Your book left me with a sort of chicken-and-egg question. You connect the growth of the profession with certain cultural norms -- the tendency to define oneself as middle-class, the expectation of private home ownership, etc. Didn't those aspirations have really deep roots in American culture, which the Realtors simply appealed to as part of their own legitimization? Or were they more the result of lobbying, advertising, and other activities of the real-estate profession?

A: Absolutely, these tendencies have roots deep in American culture. The term "middle class" was not really used until the late 19th century -- "middling sorts" was the more prevalent term before then. The "classless society" has long been a trope in American culture, the idea that with hard work, perseverance, and a little luck, anyone can "make it" in America, that the boundaries between social positions are fluid, etc.  

But it’s not until the early-to-mid 20th century that homeownership and middle-class identity come to be conflated.  The "American Dream" is redefined from being about political freedom to being about homeownership. At around the same time, debt is redefined as "credit" and "equity."

So, yes, I’d agree to some extent that the Realtors tapped into longstanding cultural norms as part of their efforts at self-legitimization. Like most successful political actors, they harnessed cultural commonsense for their own ends -- namely, to make homeownership integral to middle-class identity. Their political work enabled them, in the midst of the Depression, to get the National Housing Act passed as they wrote it -- with provisions that greatly privileged just the sort of single-family, suburban homes leading members of NAREB were intent on building.

The Realtors used the cultural material at hand to make their interests seem to be the interests of the whole society. But, as we know from many fine studies of suburban development, many people and many competing visions of the American landscape were marginalized in the process.

Author's email: 
scott.mclemee@insidehighered.com

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Admissions: Worse Than Ever

Our younger child just finished the college admissions sweepstakes. He got into one of his top choice schools, but he says he feels more unburdened than proud. Now he can get on with his life, enjoying the things he loves to do. He no longer has to worry about marketing his “admissions package,” as if he were the latest toothpaste or laundry detergent. 

Our family last went through the admissions experience eight years ago when our older child applied to college. Although he ended up at one of the “hot” Ivy League universities, we sadly concluded that the selective college admissions process had no redeeming social value. You just lived through it, hoped your child survived unscathed, and prepared to hand over your bank account.  

Unfortunately, it has gotten worse since then. More than ever, higher education seems like a commodity, as selective colleges market themselves shamelessly, increase applicant demand, and manage enrollments as if they were commercial enterprises. And, in response, an industry of expensive services and consultants to teach applicants how to game the admissions system is booming. Uncalculated is the toll on students, integrity and fundamental fairness.

This time around, college planning started just before ninth grade, when the college counselor at our son’s school met with parents and students to advise on the importance of course selection over the next four years. The message was to take diverse and challenging courses if you hope to get into a selective college -- loosely defined as the top 50 colleges and universities in the U.S. News & World Report annual survey. No big deal: Anyone who is interested in a rigorous liberal arts education for their child would probably take this advice anyway.  

Then came 10th grade’s pre-pre-college admissions testing regimen: the PSAT, given by the College Board, and the PLAN, from ACT Inc.  This was to get students ready to take the same tests again in 11th grade, to get them ready to take the tests that count big time in college admissions, the SAT and ACT.  Although originally devised as alternatives, counselors now tell students to take both the SAT and the ACT and submit the score of the one they do best on. These tests are in addition to at least three SAT II  “achievement” tests and, of course, a battery of Advanced Placement exams for those rigorous courses they are counseled to take. Pile on top of these the now de rigueur SAT and ACT review courses -- at, not incidentally, anywhere from $700 to $3,000 a pop.  

Our son, a motivated student with top grades and a challenging academic program, is a very good, but not spectacular, standardized test-taker. Friends with children at other schools told us that kids had to have 1500 SATs to be in the admissions hunt at top-echelon colleges. Looking at the median test scores published by colleges and information services all over the Internet, this notion did not seem completely off-base. But even if it meant going to a lesser member of the “nifty 50” group of colleges, our son eschewed review courses on the grounds that he already had a heavy schedule and would rather read some good books than spend hours taking boring SAT or ACT prep classes. Obviously, we had done something right in his education, but we were definitely out of the mainstream.

He opted not to take the SAT at all, and ended up scoring in the 99th percentile on the ACT after doing some test prep at home on his own. This he was proud of, because, as he said, he isn’t a wiz at standardized tests, and he didn’t take an expensive prep course. I suppose it was a kind of reverse snobbery (“anyone can do well if they take a prep course, but I did it on my own”) and a real sign of the times in the selective college admissions world.  

Fate was cruel to him in other ways. The night before the first AP exam in his junior year, he developed golf-ball-sized lymph nodes all over his neck and groin that looked suspiciously like lymphoma. It took four days to determine that he had mono, not cancer. This scare did put the whole college admissions lunacy in perspective for us.  

On the other hand, our son endured AP and SAT II exams while suffering from mono. Now he had a new dilemma. Does he tell colleges he took the exams while sick? Does he take tests over in the fall?  No matter how well he did, would he have done better if he had not had mono? In the end, he decided to accept fate. He did reasonably well on the tests, there were limits to how much of his life he was prepared to devote to getting into the “perfect” college, and he did not like making excuses, even good ones.

Our son’s college application experience was tame compared to children of a lot of upwardly mobile, well-educated, Baby Boom parents. For starters, the popularity of private “college consultants,” notwithstanding their ludicrous fees, took us by surprise. One family we know had a consultant on retainer from the time the child was in seventh grade. This was in addition to the cost of SAT prep courses and the professional editor for the college essay. The total bill for these services was more than $30,000.  

An acquaintance we bumped into at a wedding last summer informed us she had just opened a private college consulting business, having recently retired from her position as a highly successful college counselor at an elite prep school. She offers a four-year package for about $15,000, or the college-application-only option for the all-important senior year for about $5,000. Her phone was ringing off the hook. Could this possibly be worth the extraordinary expense?

More important, what message does it send to children about their worth and competence when we act as if the only way they can make it into a selective college is to hire high-priced help to package and market them? Is the admissions prize worth this psychological price? Just as bad, are we raising a generation of young cynics?

Looking for Help

A quick Internet search revealed no shortage of expensive, fear-mongering consultants to guide students and their families through what they imply is the mine field of selective college admissions. After reading these sites, we wondered if a mere mortal could possibly fill out an application for an elite college, never mind actually get in. I went to Amazon.com and did a search for books on college admissions. The first book that turned up was A Is for Admission: The Insider’s Guide to Getting into the Ivy League and Other Top Colleges (Warner Books, 1999), the controversial, tell-all exposé of selective college admissions by Michelle A. Hernandez. Hernandez is a former Ivy League admissions officer who now has -- you guessed it -- a college consulting business. I ordered the book and read it cover to cover.

She confirmed what our older son had learned from an admissions office friend at his Ivy League university: You are lucky if an admissions reader devotes 15 minutes to the application your child labored over for months. It might even be more like 10 minutes. Hernandez also explained how, by calculating a so-called “academic index,” the selective college admissions office will reduce your child’s entire high school career to one number, weighted heavily in favor of standardized tests. The book had the ring of truth, not the least because it confirmed my by-now-cynical view of the selective college admissions process.  

Hernandez also instructed how to play the admissions game, with specific coaching like: play down economic advantages; play up work experience, especially hard manual labor; show long-term passion about a few things; choose teachers for recommendations who you know can write with style; and most importantly (was this tongue-in-cheek?) be yourself. Her follow-on volume, Acing the College  Application: How to Maximize Your Chances for Admission to the College of Your Choice, was prescriptive about how to fill out an application, including how to do the “brag sheet,” the list of activities and interests that is required in the Common Application  now used by most colleges.

Of course, her example of a brag sheet, taken from one of her clients, made the applicant sound like a combination of Albert Schweitzer and Steven Spielberg. If this was the competition, it was very discouraging. Her advice on college interviews was sensible and contained a list of common interview questions. (Spot on, according to our son, after having gone through six interviews.) You can retain Ms. Hernandez for what is undoubtedly thousands of dollars, or you can buy the books for a total of about $25. We chose the cheap alternative.  

One of the great eye-openers in the college admissions experience was the amount of disingenuousness involved in writing the college essay. Our son’s school spends a few weeks in English class early in the senior year working on crafting personal essays in order to prepare for college applications, so we naively assumed that students wrote their own college essays.

Not necessarily. As we spoke to parents in other places who had lived through the senior year with their children, we personally came to know of a father who wrote his daughter’s college essay, a father who had his son’s college essay written by an employee of the father’s business, and parents who hired professional editors or writers to “help” with the college essay. The worst part is that in every case, these children got into their first choice schools.  

We live in a small town in upstate New York and thought we were immune to what we viewed as these metro-area ethical challenges. Wrong again. The summer before our son’s senior year, we received a glossy brochure from a professional writer in our town. He has gone into the business of helping students to “find their voices” in the “all important” college essay, a service for which he charges the mere pittance of $1,500. Isn’t your child’s future worth it? There seems to be so much deception in college essay writing, I have come to the conclusion that essays should be eliminated from applications in favor of a personal essay question administered in a controlled environment by the College Board or  ACT and forwarded by them to colleges. Ironically, I never imagined I would find myself advocating for yet another college admissions test.  

The same family that spent more than $30,000 on college consultants claimed that the college counseling staff at their well-regarded country day school advised that if the family was of a charitable bent, the application year would be a good time to make a significant donation to their child’s first-choice college. The family said they pledged half a million. An old friend who has been on the faculty of an elite liberal arts college in New England for a quarter century confirmed that over the past five years it has become well known that a contribution of $500,000 to $1 million to a selective college can secure a spot in the class for a student who is academically qualified.

Since 90 percent of applicants to such colleges are academically qualified and most of them are not admitted, the wealthy who are prepared to be generous at the right time appear to be able to buy admission for their children. Off the record, some selective college administrators we know demur that you have to pledge to rebuild the library in order to influence an admissions decision. Whatever the price, the dirty little secret seems to be that admission is for sale in what sounds like a pretty straightforward, if expensive, transaction.

Toward the end of our son’s wait to hear from colleges, he had a nightmare that notification finally came but merely said, “No conclusion.” Did it mean he was consigned to college admissions purgatory forever? This was a fate worse than death. Happily, he awoke and was eventually admitted. Just as happily, we will never have to live through this experience again.

But we cannot help wondering if the selective college admissions process is losing integrity with every passing year. Reading thousands of applications at ten or fifteen minutes apiece, can admissions officers really see through anything but the most obvious and overblown applicant marketing? How can we believe their universal representation that each application is carefully reviewed? And what happens to families whose children go to schools with under-staffed and overburdened guidance offices and who cannot afford private college consultants, clever essay editors, test prep courses and mammoth charitable contributions?

These questions raise issues of fairness that go far beyond the current debates about affirmative action. Let’s hope the colleges are trying to answer them.  

Author's email: 
info@insidehighered.com

Deirdre Henderson is a mother and lawyer who lives in upstate New York.

Cheating in a Time of Extenuating Circumstances

Whatever happened to cheating? The question occurred to me the other day, when I turned on the television and found myself watching School Ties, a 1992 movie about a posh New England boarding school starring Brendan Fraser and Matt Damon. Damon cheats on an exam. Fraser sees him. When the teacher finds Damon's cheat sheet on the floor, he challenges the cheater to come forward, or else the class to bring him forward, according to the dictates of the school honor code.

Eventually, Damon is named and expelled. But he is identified by the head prefect, not Fraser. Both should have come forward sooner themselves. "The honor code is a living thing," declares the dean. "It cannot exist in a vacuum." Precisely. The trick of the movie is to provide a vacuum -- the time is 1955 -- and then let just enough real-world air leak in so that, alas, we secretly wish that the vacuum -- leaving aside the anti-Semitism that contaminates the world portrayed in this particular movie -- could somehow have remained sealed.

What clarity back then! Damon does cheat. He knows what he did is cheating. There is no nonsense on his part about having made an "error of judgment," and no cant on anybody else's part about "extenuating circumstances." Everybody else recognizes what cheating is. Nobody has to ask. "Someone has robbed you of your honor," the teacher tells the class. "If I ignore it, you will rob me of mine as well."

Today, however, the very word, "cheating," sounds, well, crude, perhaps a bit antique, even irrelevant. I asked a friend of mine to tell me a cheating story. He immediately recalled a student of his who was getting an A. Come the last paper. It was plagiarized. My friend decided to give the student a final grade of B. "You plagiarized your final paper," he told the student when he came round to inquire about his grade next semester. The student just shrugged and walked away.

In the world of School Ties, the student who cheats has dishonored himself. In the world of grade inflation and Enron, though, the student has merely been caught. How to account for the difference? One might just as well try to explain the loss of the idea of "honor." Can it only function in highly circumscribed (perhaps ultimately military) contexts? As an operative value in normal academic circumstances, has "honor" now been as utterly undone by a student culture of excuses as this culture has been thoroughly saturated by a larger American culture of victimization?
 
Hard questions. Let me try to concentrate instead on one feature: how the occasion for cheating has changed. In School Ties, this occasion is a test. In my friend's instance it is a text. At least some of the reason that cheating leads such a baffled existence in the academy at the present time is that a text is not a test. In each case, the standards by which to judge whether cheating has occurred in any one specific instance seem to be the same. They are not.

The circumstances in which a test takes place are, virtually by definition, controlled, while those through which a paper gets written are not. It is possible to monitor the space where a test is being given; it is not possible to monitor where a paper is written. In addition, time is different in each case: a test is designed to be completed at a designated place during a certain period of time, whereas a paper is merely subject to a deadline; it can actually be written over any amount of time, anywhere.

What this means in evaluative practice is not only that the opportunities to cheat (just to continue to use this word) are enormously expanded. The nature of cheating itself changes accordingly -- to the despair of every teacher, beginning with those who teach freshman composition. The very fact that "plagiarism" must be carefully defined there attests to the absence of what the dean in School Ties refers to as a vacuum. (Could cheating even be punished -- in his terms -- if one has to begin by defining it?) It also testifies to the near-impossibility of judging a paper on SUVs or gay marriage or God-knows-what that has been cobbled together out of Internet sources whose fugitive presence, sentence by sentence, is almost undetectable.

Furthermore, to the student these sources may well be almost unremarkable, with respect to his or her own words. What is this business of one's "own words" anyway? What if the very notion has been formed by CNN? How not to visit its site (say) when time comes to write? Most students will be unfamiliar with a theoretical orientation that questions the whole idea of originality. But they will not be unaffected by some of its consequences, any more than they are unaffected by, say, the phenomenon of sampling and remixing as it takes place in popular culture, especially fashion or music.

"Plagiarism" has to contend with all sorts of notions of imitation, none of which possess any moral valence. Therefore, plagiarism becomes -- first, if not foremost -- a matter of  interpretive judgment.

Cheating, on the other hand, is not interpretive in the same way (and, in the world of School Ties, not "interpretive" at all). No wonder, in a sense, that test gradually has had to yield to text. It is almost as if the vacuum could not hold. By the present time, the importance of determining grades (in part if not whole) by means of papers acquires the character of a sort of revenge of popular culture -- ranging from cable television to rap music -- upon academic culture.

I do not mean to slight the hundreds or thousands of occasions where tests (beginning with the SAT) remain the evaluative instrument of choice. I do mean to explore why cheating is something enacted today by students who just shrug when told of it. Or becomes something confronting teachers who are perplexed when deciding what to do about it. Are the stakes now simply too low, at least at the undergraduate level? You try (say) to get students to learn some minimal rules about citation. You try to stay away from some more searching consideration about why citation is necessary in the first place. You give as few tests as possible.

Of course, you have to provide grades. So you inflate them. Or rather, the whole academic culture, which breathes inflation, virtually heaves onto the grade sheet a grade that, well, you could justify (you reason), although in an ideal world it would simply be too high. You're not cheating to choose the higher grade. In a decisive sense, it's simply being chosen through you.

The phenomenon known as "grade inflation" is not the same as cheating. Grade inflation simply possesses the immense advantage of grinding up all sorts of edgy moments from both sides of the equation and spewing them out in discursive mush.

Finally, you also read the newspapers. The other day there was a story about a student from Serbia, a basketball player, whose failing grade was allegedly changed by her Ohio State instructor in "Rural Sociology" after she was asked to do so by an OSU booster. Why? The student was having "personal problems." What exactly were they? Her life might be endangered if she had to return home. Talk about "extenuating circumstances"! How to speak of "honor" -- or whatever would be the value preventing a grade change -- once life itself is at stake? But there was more.

It turns out that the booster says he was asked to speak to the instructor by the basketball coach. It further seems that the booster was also acting as the student's sponsor and host family, a role that involved payment by the university. Suddenly, circumstances shift, and do not seem quite so extenuating (or even the same circumstances).

Meanwhile, it seems grades cannot be changed without the approval of the department chair. So was he or she in on the extenuation? By the end, we are almost in the realm of fiction, and it would make happier sense if the chair were actually Serbian. However, the journalistic report remains unhappily literal, or as much as a wider public will ever learn, anyway.

So what to conclude? That cheating has expanded so much that it now includes or comprehends many routine academic practices (including grade changing, or even a subject such as Rural Sociology)? It's hard to know what to think about cheating anymore, which is one reason why it's easy to relax before movies such as School Ties.

I missed the fatal exam subject in the movie. Symbolically, it should be Latin. Those were the days! Not the least of the reasons Latin was such an excellent subject is that it appeared to make the determination of whether or not cheating had occurred as clear as the dative case. In contrast, part of the problem with a subject such as Rural Sociology is that all a poor dishonest student can do with it, if pressed, is to plagiarize. Meanwhile, while the rest of us continue to struggle with all manner of extenuating circumstances, "cheating" steals away in quotation marks.

Author's email: 
caesar@clarion.edu

Terry Caesar's last column was about the physical spaces in which professors teach.

Throat Culture

For the past few days, I've been waiting for a review copy of Bob Woodward's book The Secret Man: The Story of Watergate's Deep Throat to arrive from Simon and Schuster. So there has been some time to contemplate the way that (no longer quite so) mysterious figure has been "inscribed" in a "double register" of "the historical imaginary," as the cult-stud lingo has it. (Sure hope there's a chance to use "imbricated discourse" soon. Man, that would be sweet.)

Putting it in slightly more commonplace terms: Two versions of Deep Throat have taken shape in the past 30 years or so. They correspond to two different ways of experiencing the odd, complex relationship between media and historical memory.

On the one hand, there was Deep Throat as a participant in a real historical event -- making the question of his motivation an important factor in making sense of what happened. It was even, perhaps, the key to understanding the "deep politics" of Watergate, the hidden forces behind Richard Nixon's fall. The element of lasting secrecy made it all kind of blurry, but in a fascinating way, like some especially suggestive Rorschach blot.

On the other hand, there was Deep Throat as pure icon -- a reference you could recognize (sort of) even without possessing any clear sense of his role in Watergate. It started out with Hal Holbrook's performance in All the President's Men -- which, in turn, was echoed by "the cigarette-smoking man" on "The X Files," as well as the mysterious source of insider information about the Springfield Republican Party on "The Simpsons." And so Deep Throat (whose pseudonym was itself originally a movie title) becomes a mediatic signifier unmoored from any historical signified. (An allusion to an allusion to a secret thus forgotten.)

Different as they might be, these two versions of Deep Throat aren't mutually exclusive. The discourses can indeed become imbricated (yes!), as in the memorable film Dick, which reveals Deep Throat as a pair of idealistic schoolgirls who guide the cluelessly bumbling Woodward and Bernstein through the mysteries of the Nixon White House.

There is something wonderful about this silly premise: In rewriting the history of Watergate, Dick follows the actual events, yet somehow neutralizes their dire logic by just the slightest shift of emphasis. The deepest secret of an agonizing national crisis turns out to be something absurd.

That perspective is either comically subversive or deeply cynical. Either way, it's been less anticlimactic, somehow, than the revelation of Deep Throat's real identity as the former FBI official Mark Felt. So much for the more elaborate theories about Watergate -- that it was, for example, a "silent coup" by a hard-right anticommunist faction of the U.S. military, upset by the administration's dealings with the Soviets and the Chinese. And Deep Throat's role as emblem of noir-ish intrigue may never recover from the impact of the recent, brightly lit video footage of Mark Felt -- half-dazed, half mugging for the camera.

And there have been other disappointments. This week, I had an interesting exchange by e-mail with Bill Gaines, a professor of journalism at the University of Illinois at Urbana-Champaign and two-time winner of the Pulitzer, not counting his two other times as finalist. His part in the Deep Throat saga came late in the story, and it's caused him a certain amount of grief.

But it was also -- this seems to me obvious -- quite honorable. If anything, it is even more worthy of note now that Bob Woodward is telling his side of the story. (While Carl Bernstein also has a chapter in the book, it was Woodward who had the connection with Felt.)

In 1999, Gaines and his students began an investigation designed to determine the identity of Deep Throat. The project lasted four years. It involved sifting through thousands of pages of primary documents and reading acres of Watergate memoir and analysis -- as well as comparing the original articles by Woodward and Bernstein from The Washington Post to the narrative they provided in their book All the President's Men. Gaines also tracked down earlier versions of the manuscript for that volume -- drafted before Woodward decided to reveal that he had a privileged source of inside information.

Gaines and his students compiled a database they used to determine which of the likely candidates would have actually been in a position to leak the information that Deep Throat provided. In April 2003, they held a press conference at the Watergate complex in Washington, DC, where they revealed ... the wrong guy.

After a period of thinking that Deep Throat must have been Patrick Buchanan (once a speechwriter for Nixon), the researchers concluded that it had actually been Fred Fielding, an attorney who had worked as assistant to John Dean. The original report from the project making the case for Fielding is still available online -- now updated with a text from Gaines saying, "We were wrong."

The aftermath of Felt's revelation, in late May, was predictably unpleasant for Gaines. There were hundreds of e-mail messages, and his phone rang off the hook. "Some snickered as if we had run the wrong way with the football," he told me.

But he added, "My students were extremely loyal and have told anyone who will listen that they were thrilled with being a part of this project even though it failed." Some of those who worked on the project came around to help Gaines with the deluge of correspondence, and otherwise lend moral support.

As mistaken deductions go, the argument offered by Gaines and his students two years ago is pretty rigorous. Its one major error seems to have come at an early stage, with the assumption that Woodward's account of Deep Throat was as exact as discretion would allow. That was in keeping with Woodward's own statements, over the years. "It's okay to leave things out to protect the identity of a source," he told the San Francisco Chronicle in 2002, "but to add something affirmative that isn't true is to publish something you know to be an inaccuracy. I don't believe that's ethical for a reporter."

The problem is that the original account of Deep Throat doesn't line up quite perfectly with what is known about Mark Felt. Some of the discrepancies are small, but puzzling even so. Deep Throat is a chain smoker, while Felt claimed to have given up the demon weed in 1943. "The idea that Felt only smokes in the garage [during his secretive rendezvous with Woodward] is a little hard to swallow," says Gaines. "I cannot picture him buying a pack and throwing the rest away for the drama it will provide." By contrast, Fielding was a smoker.

More substantive, perhaps, are questions about what Deep Throat knew and how he knew it. Gaines and his students noted that statements attributed to Deep Throat in All the President's Men were credited to a White House source in the original newspaper articles by Woodward and Bernstein. (Felt was second in command at the FBI, not someone working directly for the White House, as was Fielding.)

Deep Throat provided authoritative information gleaned from listening to Nixon's secret recordings during a meeting in November 1973. That was several months after Felt left the FBI. And to complicate things still more, no one from the FBI had been at the meeting where the recordings were played.

According to Gaines, that means Felt could only have learned about the contents of the recordings at third hand, at best. Felt was, as Gaines put it in an e-mail note, "so far removed that his comments to Woodward would have to be considered hearsay, and not the kind of thing a reporter could write for fact by quoting an anonymous source."

When I ask Gaines if there is anything he hopes to learn from Bob Woodward's new book, he mentions hoping for some insight into one of the more memorable descriptions of the secret source -- the one about how Deep Throat "knew too much literature too well." In any case, Gaines makes a strong argument that Woodward himself took a certain amount of literary license in transforming Felt into Deep Throat.

"We know from our copy of an earlier manuscript that Woodward changed some direct quotes attributed to Throat," he notes. "They were not major changes, but enough to tell us that he was loose with the quotes. There is information attributed to Throat that Felt would not have had, or that doesnot agree with what we found in FBI files."

As the saying has it, journalists write a first draft of history. One of the ethical questions involves trying to figure out just how much discretion they get in polishing the manuscript. Gaines seems careful not to say anything too forceful on this score -- though he does make clear that he isn't charging Woodward with creating a composite character.

That has long been one of the suspicions about Deep Throat. Even the new revelation hasn't quite dispelled it. Just after Felt went public with his announcement, Jon Wiener, a professor of history at the University of California at Irvine, reviewed some of the grounds for thinking that "several people who provided key information ... were turned into a composite figure for dramatic purposes" by Woodward and Bernstein. (You can find more of Wiener's comments here, at the very end of the article.)

For his part, Gaines says that the Deep Throat investigation isn't quite closed -- although he wishes it were. "I have always wanted to move on to something more important for the class project," he told me, "but the students and the media have caused us to keep going back to the Throat story."

Maybe now they should look into the mystery surrounding Deep Throat's most famous line: his memorable injunction to Woodward, "Follow the money."

It appears in the movie version of All the President's Men, though it can't be found in the book. When asked about it in an interview some years ago, Woodward guessed that it was an embellishment by William Goldman, the screenwriter. But Goldman has insisted that he got the line from Woodward.

Now it's part of the national mythology. But it may never have actually happened. Sometimes I wish the discourses would stop imbricating long enough to get this kind of thing sorted out.

Author's email: 
scott.mclemee@insidehighered.com

Rip-Off

"They're just darlings," my co-worker said. "Absolute darlings."

"Uh-huh," I agreed, staring at my grading sheet. She is discussing five athletes at the private, four-year university where we teach. As another part-time foreign-language instructor comes in, I overhear their
conversation.

"Well, they'll never get through nine chapters," said the "darling" woman.

"Oh," responded my friend, a woman who teaches Spanish.

"I'm going back to Chapter Five," said the first instructor, "I just love teaching these darling, darling boys."

I sat there, stunned. Was I hearing correctly? Was she simply dropping half of the curriculum to cater to a few students who couldn't do the work? Later, when we were alone in the office, I commented, "It's so hard to get them to work, but I keep pushing. I've got to get them through the whole book or they're sunk next semester."

"Oh, well, that's how it is with English comp, I'm sure," said the "darling" woman. "I mean you've got to cover the material."

"How is that different with Spanish?" I finally asked.

"Oh, well, I've got to make sure that they really get it." She responded. Frustrated, I couldn't think of anything else to say. This adjunct had developed a curriculum based on department-approved course objectives. She had turned in copies of her syllabus to the academic dean for approval. Then, frustrated by her students' inability or unwillingness to learn, she had simply chopped off the back end of her course.

Later she had confided that there were a few students who were "getting it," but that they would simply have to review the same materials over and over until the end of the semester because she was catering to the athletes. That morning, as my colleague left for her class, I jotted the note, "curriculum rip-off" in my notebook. Something would come of this, I thought. Something.

At lunch that day, with the provost at the head of the table, I commented that a fellow instructor wasn't teaching the curriculum. "What do you mean?" the provost asked, his voice surprisingly kind for a man in power.

"She said the athletes in her class weren't learning," I paused, unsure if I should go on, "so she cut out the last four chapters of the book."

"You're kidding!" said a physics instructor to my right.

"She'll review them later, right?" the provost asked.

Trembling, I kept my hands in my lap, "I got the impression that she wasn't going to teach the last four chapters at all."

"Really," said the provost. "What's her name?"

"Oh, I really couldn't say," I mumbled, gathering up my half-finished tray.

Face reddening, I made my way to drop off my tray. What had made me speak up? Me, an adjunct? A part-timer with no tenure, no security, no voice. I didn't bring it up again. In the next days, I asked co-workers innocuous questions about their classes. I found it hard to make eye contact with the provost.

What had made me speak up? Anger. A feeling that not only would the next instructors to teach these students be frustrated, their jobs made that much more difficult, but that the students were being ripped off wholesale.

According to the students, the less they were taught, the better. But I knew better. And I had been on the receiving end of some of these half-taught students. One of my colleagues at a large community college in California had confessed that he passed any student who would sit through his course. With no work on which to grade them, he simply gave them all C's. He was not the only one, I realized.

When I struggled with a student whose grammar was shockingly poor and who could not form a decent paragraph or essay, I sometimes wondered whether the student had simply tested well on the eligibility exam or whether an unwitting colleague had passed him or her along to me.

And what did the students get out of this? Yes, their semester was easier. Yes, they had less homework. Yes, they could spend more time on sports. But at what cost? Their education was being whittled away by instructors who could not or would not insist on the curriculum. It was a simple matter of trading the long-term goal for short-term ease. Given the choice, I knew, only a small percentage of the students would vote for learning all that they were promised. Yes, some would complain and wheedle, but I must believe that instructors know better.

We are in a position of power and we must not misuse that power by stealing. And when we lop off a part of the curriculum that is too bothersome or too difficult for some students, we are stealing from all of the students. One colleague confessed that she often had to switch lesson plans around to teach what she needed to -- but she always covered the chapters that she had promised.

I'm not sure if she had been burned by a colleague or if she simply knew what the right thing to do was, but I admire her stance. I, too, frequently find that I need to "borrow from Peter to pay Paul" in lesson making, but I always cover the curriculum. Even in the classroom, when I am tempted to cut out a section that once seemed important, I review the materials later in my office and talk to senior instructors who can guide me.

It is dangerous to make impromptu decisions at the chalkboard. More often than not, I am dreaming of new ways to teach something that seems tedious -- a new essay, a new exercise, or examples taken from my own classes. Anything to get them to see the lesson in a new way. My struggle sometimes reminds me of my effort to clip my terrier's nails. After an hour of my struggling and his howling, I finally brought my dog to the local veterinarian and paid the $15. His nails did get clipped. In the same way, I struggle with curriculum, but in the end, it gets taught.

My last concern was a big one -- what about our accreditation? This four-year university already had a poor reputation. Once known as a feeder campus for Stanford University, its price tag now seemed to have no correlation to its rigor or value. What if our accreditors found that we were not teaching the curriculum? What if they somehow found out that we were not achieving the course objectives that they had originally approved? What then?

After working on committees at the large community college in California, I had learned a healthy respect for the powers that be. Whether one was a tenured full-time instructor or an adjunct, we simply did not have the right to make such decisions on our own.

Suddenly I was thankful for those who had mentored me -- even those kind souls who sat at lunch with me. Their opinions, ideas and suggestions were helping to shape me. Every day, every semester. So many teachers, struggling, wrangling, working to be sure that curriculum gets taught. What a blessing to be one of those who hold the line. And those who benefit? We do. Instructors, administrators, and, most importantly, the students.

Author's email: 
info@insidehighered.com

Shari Wilson is the pseudonym of an adjunct who has taught at many colleges in California. In a column last month, she wrote about the unintended consequences of the "six year rule" on faculty members who are off the tenure track.

The Corrosion of Ethics in Higher Education

In its 1966 declaration on professional ethics, the American Association of University Professors, the professoriate’s representative organization, states: 

"Professors, guided by a deep conviction of the worth and dignity of the advancement of knowledge, recognize the special responsibilities placed upon them....They hold before them the best scholarly and ethical standards of their discipline.… They acknowledge significant academic or scholarly assistance from (their students)."

Notwithstanding such pronouncements, higher education recently has provided the public with a series of ethical solecisms, most spectacularly the University of Colorado professor Ward Churchill’s recidivistic plagiarism and duplicitous claim of Native American ancestry along with his denunciations of 9/11 victims. While plagiarism and fraud presumably remain exceptional, accusations and complaints of such wrongdoing increasingly come to light.

Some examples include Demas v. Levitsky at Cornell, where a doctoral student filed a legal complaint against her adviser’s failure to acknowledge her contribution to a grant proposal; Professor C. William Kauffman’s complaint against the University of Michigan for submitting a grant proposal without acknowledging his authorship; and charges of plagiarism against Louis W. Roberts, the now-retired classics chair at the State University of New York at Albany. Additional plagiarism complaints have been made against Eugene M. Tobin, former president of Hamilton College, and Richard L. Judd, former president of Central Connecticut State University.

In his book Academic Ethics, Neil Hamilton observes that most doctoral programs fail to educate students about academic ethics, with the result that knowledge of it is eroding. Lack of emphasis on ethics in graduate programs leads to skepticism about the necessity of learning about ethics and about how to teach it. Moreover, nihilist philosophies that have gained currency within the academy itself, such as Stanley Fish’s “antifoundationalism,” contribute to the neglect of ethics education.

For these reasons academics generally do not seriously consider how ethics education might be creatively revived. In reaction to the Enron corporate scandal, for instance, some business schools have tacked an ethics course onto an otherwise ethically vacuous M.B.A. program. While a step in the right direction, a single course in a program otherwise uninformed by ethics will do little to change the program’s culture, and may even engender cynicism among students.

Similarly, until recently, ethics education had been lacking throughout the American educational system. In response, ethicists such as Kevin Ryan and Karen Bohlin have advocated a radical renewal of ethics education in elementary schools. They claim that comprehensive ethics education can improve ethical standards. In Building Character in Schools, Ryan and Bohlin compare an elementary school to a polis, or Greek city state, and urge that ethics be fostered everywhere in the educational polis.

Teachers, they say, need to set standards and serve as ethical models for young students in a variety of ways and throughout the school. They find that manipulation and cheating tend to increase where academic achievement is prized but broader ethical values are not. They maintain that many aspects of school life, from the student cafeteria to the faculty lounge, ought to provide opportunities, among other things, to demonstrate concern for others. They also propose the use of vision statements that identify core virtues along with the implementation of this vision through appropriate involvement by staff and students.

We would argue that, like elementary schools, universities have an obligation to ethically nurture undergraduate and graduate students. Although the earliest years of life are most important for the formation of ethical habits, universities can influence ethics as well. Like the Greek polis, universities become ethical when they become communities of virtue that foster and demonstrate ethical excellence. Lack of commitment to teaching, lack of concern for student outcomes, false advertising about job opportunities open to graduates, and diploma-mill teaching practices are examples of institutional practices that corrode rather than nourish ethics on campuses.

Competency-based education, broadly considered, is increasingly of interest in business schools.  Under the competency-based approach (advocated, for example, by Rick Boyatzis of Case Western Reserve University, David Whetten of Brigham Young University, and Kim Cameron of the University of Michigan), students are exposed not only to theoretical concepts, but also to specific competencies that apply the theory. They are expected to learn how to apply in their lives the competencies learned in the classroom, for instance those relating to communication and motivating others. Important ethical competencies (or virtues) should be included and fostered alongside such competencies. Indeed, in applied programs such as business, each discipline and subject can readily be linked to ethical virtues. Any applied field, from traffic engineering to finance, can and should include ethical competencies as an integral part of each course. 

For example, one of us currently teaches a course on managerial skills, one portion of which focuses on stress management. The stress management portion includes a discussion of personal mission setting, which is interpreted as a form of stress management. The lecture emphasizes  how ethics can intersect with practical, real world decision making and how it can relate to competencies such as achievement orientation. In the context of this discussion, which is based on a perspective that originated with Aristotle, a tape is shown of Warren Buffett suggesting to M.B.A. students at the University of North Carolina that virtue is the most important element of personal success.

When giving this lecture, we have found that street smart undergraduate business students at Brooklyn College and graduates in the evening Langone program of the Stern School of Business of New York University respond well to Buffett’s testimony, perhaps better than they would to Aristotle’s timeless discussions in Nicomachean Ethics.

Many academics will probably resist integration of ethical competencies into their course curriculums, and in recent years it has become fashionable to blame economists for such resistance. For example, in his book The Moral Dimension, Amitai Etzioni equates the neoclassical economic paradigm with disregard for ethics. Sumantra Ghoshal’s article “Bad Management Theories are Destroying Good Management Practices,” in Academy of Management Learning and Education Journal, blames ethical decay on the compensation and management practices that evolved from economic theory’s emphasis on incentives.

We disagree that economics has been all that influential. Instead, the problem is much more fundamental to the humanities and social sciences and has its root in philosophy. True, economics can exhibit nihilism. For example, the efficient markets hypothesis, which has influenced finance, holds that human knowledge is impotent in the face of efficient markets. This would imply that moral choice is impotent because all choice is so. But the efficient markets hypothesis is itself a reflection of a deeper and broader philosophical positivism that is now pandemic to the entire academy.
 
Over the past two centuries the assaults on the rational basis for morals have created an atmosphere that stymies interest in ethical education. In the 18th century, the philosopher David Hume wrote that one cannot derive an “ought” from an “is,” so that morals are emotional and cannot be proven true. Today’s academic luminaries have thoroughly imbibed this “emotivist” perspective. For example, Stanley Fish holds that even though academics do exhibit morality by condemning “cheating, academic fraud and plagiarism,” there is no universal morality beyond this kind of “local practice.” 

Whatever its outcome, the debate over the rational derivability of ethical laws from a set of clear and certain axioms that hold universally is of little significance in and of itself.  It will not determine whether ethics is more or less important in our lives; nor will it provide a disproof of relativism -- since defenders of relativism can still choose not to accept the validity of the derivation.

Yet ethics must still be lived -- even though the knowledge, competency, skill or talent that is needed to lead a moral life, a life of virtue, may not be derived from any clear and certain axioms. There is no need for derivation of the need, for instance, for good interpersonal skills. Rather, civilization depends on competency, skill and talent as much as it depends on practical ethics. Ethical virtue does not require, nor is it sustained by, logical derivation; it becomes most manifest, perhaps, through its absence, as revealed in the anomie and social decline that ensue from its abandonment.  Philosophy is beside the point.

Based on much evidence of such a breakdown, ethics education experts such as Thomas Lickona of the SUNY College at Cortland have concluded that to learn to act ethically, human beings need to be exposed to living models of ethical emotion, intention and habit. Far removed from such living models, college students today are incessantly exposed to varying degrees of nihilism: anti-ethical or disembodied, hyper-rational positions that Professor Fish calls “poststructuralist” and “antifoundationalist.” In contrast, there is scant emphasis in universities on ethical virtue as a prerequisite for participation in a civilized world. Academics tend to ignore this ethical prerequisite, preferring to pretend that doing so has no social repercussions.

They are disingenuous – and wrong.

It is at the least counterintuitive to deny that the growing influence of nihilism within the academy is deeply, and causally, connected to increasing ethical breaches by academics (such as the cases of plagiarism and fraud that we cited earlier). Abstract theorizing about ethics has most assuredly affected academics’ professional behavior.

The academy’s influence on behavior extends, of course, far beyond its walls, for students carry the habits they have learned into society at large. The Enron scandal, for instance, had more roots in the academy than many academics have realized or would care to acknowledge. Kenneth Lay, Enron’s former chairman, holds a Ph.D. in economics from the University of Houston. Jeff Skilling, Enron’s former CEO, is a Harvard M.B.A. who had been a partner at the McKinsey consulting firm, one of the chief employers of top-tier M.B.A. graduates. According to Malcolm Gladwell in The New Yorker, Enron had followed McKinsey’s lead, habitually hiring the brightest M.B.A. graduates from leading business schools, most often from the Wharton School. Compared to most other firms, it had more aggressively placed these graduates in important decision-making posts. Thus, the crimes committed at Enron cannot be divorced from decision-making by the best and brightest of the newly minted M.B.A. graduates of the 1990s.

As we have seen, the 1966 AAUP statement implies the crucial importance of an ethical foundation to academic life. Yet ethics no longer occupies a central place in campus life, and universities are not always run ethically. With news of academic misdeeds (not to mention more spectacular academic scandals, such as the Churchill affair) continuing to unfold, the public rightly grows distrustful of universities.

It is time for the academy to heed the AAUP’s 1915 declaration, which warned that if the professoriate “should prove itself unwilling to purge its ranks of … the unworthy… it is certain that the task will be performed by others.” 

Must universities learn the practical value of ethical virtue by having it imposed from without?  Or is ethical revival possible from within? 

Author's email: 
info@insidehighered.com

Candace de Russy is a trustee of the State University of New York and a Hudson Institute Adjunct Fellow. Mitchell Langbert is associate professor of business at Brooklyn College of the City University of New York.

Measures of Success

I only saw her out of the corner of my eye as I rushed into the book exhibit at the conference, but I was sure I knew her. Her face registered as out of context, somehow, but familiar. A second later, I realized it was one of my students, a recent English-major graduate of the liberal arts college at which I teach. I stopped, turned around and called to her.

She was pleased to see me. She’s a marketing assistant for a major academic publishing house, it turns out. I could tell she was proud of her job, pushing English composition and literature texts to English professors like me. We arranged to meet for dinner the next day, two professionals on a business trip.

I stopped by the marketing assistant’s exhibit while she was out at lunch, and her colleagues were eager to find out what she had been like in my classes. "She must have been a great student, huh?" one of her colleagues prompted me. Hmm. She had been solid, reliable, a good writer, and she always had something interesting to say in class, but the marketing assistant had not been one of our stars. Still, none of our stars of recent years had jobs like hers, working with literature.

Clearly her co-workers loved her. They spoke very fondly of her, and, indeed, she seemed to be very good at her job. What I hadn’t noticed in the classroom was the key quality that was working for the marketing assistant in the world after college: not her knowledge of literature but her skills with people. This I discovered very quickly the next evening at dinner.

I had already had a date for dinner that night with a friend of mine, a fiction writer, so I asked the marketing assistant if I could bring him along. "Sure," she said. "I can expense it. I’m just taking two English professors out." A new verb for me: to expense. I liked it.

She quickly took charge of the expedition, finding good restaurants and putting her name on the waiting list of one while we searched for another (Why had I never thought of that? I guess it’s not really cheating).

The marketing assistant had always been ready with an answer in class, but we’d never actually talked much about anything other than Victorian literature. Turns out she’s pretty funny, and very professional. She told great stories, often at the expense of some poor academic schmuck who stopped by her booth, intent on pitching his or her latest project. I felt sorry for the folks she described, but not because she mocked them -- she didn’t; she described them quite affectionately, as if she knew they couldn’t help themselves. The fiction writer and I shook our heads with her when she described the guy whose project was so impossibly narrow that no academic press would ever publish it. We chuckled along, though less heartily, when she wondered aloud at the fashion sense of the professoriate.

"When you look around the gate at the airport, you can always tell who’s going to the same conference you are," the marketing assistant said. Of course, we could, too, and the fiction writer and I had already had that obligatory conversation, this being his first professional conference. But it was different hearing it from the perspective of the marketing assistant. After all, as a friend of mine said ruefully, gazing around the lobby of one of the convention hotels a few years ago, “These are my people.”

When the marketing assistant got to the social skills of professors, we felt ourselves on relatively safe ground. The fiction writer has a fabulous, wry sense of humor and is good to have at parties, and I have always prided myself on being able to talk to anybody. We are not nerdy bookworms -- we both went to our proms. I snickered at her description of awkward social interactions she’d observed between academics. “It’s amazing you guys found people to get hooked up with,” she declared good-naturedly -- in her eyes we were no different from the guy we had just seen mumbling to himself as he wandered through the book exhibit. Maybe we weren’t. Maybe these really were our people.

They weren’t her people; that’s for sure. The marketing assistant had perspective on our folk that we clearly didn’t have. And that made it really fun to talk to her. I enjoyed seeing her in her professional persona. I was proud of her, glad one of our grads seemed to be heading for a successful career in publishing. But seeing her made me realize that I may not be the best assessor of my students’ skills.

Although the marketing assistant is great at her job, I would not have been able to predict that. When I look at my students, I realize, I have always concentrated on particular skills that are not necessarily the ones that will serve them best after college. Who writes the best? Whose research is most thorough? Whose reading of the novel is the most subtle? Not the most marketable skills, though they will get you into graduate school.

The marketing assistant is the same young woman she was when she was in my classroom. But much of her incisive observation, her wit, her distanced assessment and clever summing-up had passed me by when she was in college. What a letter of recommendation I could write for her now. Of course, she doesn’t need it now.  She’s already moved on. 

Author's email: 
info@insidehighered.com

Paula Krebs is professor of English at Wheaton College, in Massachusetts.

Pass It On

One sign of the great flexibility of American English -- if also of its high tolerance for ugliness -- is the casual way users will turn a noun into a verb. It happens all the time. And lately, it tends to be a matter of branding. You "xerox" an article and "tivo" a movie. Just for the record, neither Xerox nor TiVo is very happy about such unauthorized usage of its name. Such idioms are, in effect, a dilution of the trademark.

Which creates an odd little double bind for anyone with the culture-jamming instinct to Stick It To The Man. Should you absolutely refuse to give free advertising to either Xerox or TiVo by using their names as verbs, you have actually thereby fallen into line with corporate policy. Then again, if you defy their efforts to police ordinary language, that means repeating a company name as if it were something natural and inevitable. See, that's how they get ya.

On a less antiglobalizational note, I've been trying to come up with an alternative to using "meme" as a verb. For one thing, it is too close to "mime," with all the queasiness that word evokes.

As discussed here on Tuesday, meme started out as a noun implying a theory. It called to mind a more or less biological model of how cultural phenomena (ideas, fads, ideologies, etc.) spread and reproduce themselves over time. Recently the term has settled into common usage -- in a different, if related, sense. It now applies to certain kinds of questionnaires or discussion topics that circulate within (and sometimes between) blogospheric communities.

There does not seem to be an accepted word to name the creation and initial dissemination of a meme. So it could be that "meme" must also serve, for better or worse, as a transitive verb.

In any case, my options are limited.... Verbal elegance be damned: Let's meme.

The ground rules won't be complicated. The list of questions is short, but ought to yield some interesting responses. With luck, the brevity will speed up circulation.

In keeping with meme protocol, I'll "tap" a few bloggers to respond. Presumably they will do likewise. However, the invitation is not restricted to that handful of people: This meme is open to anyone who wants to participate.

So here are the questions:

(1) Imagine it's 2015. You are visiting the library at a major research university. You go over to a computer terminal (or whatever it is they use in 2015) that gives you immediate access to any book or journal article on any topic you want. What do you look up? In other words, what do you hope somebody will have written in the meantime?

(2) What is the strangest thing you've ever heard or seen at a conference? No names, please. Refer to "Professor X" or "Ms. Y" if you must. Double credit if you were directly affected. Triple if you then said or did something equally weird.

(3) Name a writer, scholar, or otherwise worthy person you admire so much that meeting him or her would probably reduce you to awestruck silence.

(4) What are two or three blogs or other Web sites you often read that don't seem to be on many people's radar?

Feel free to discard anything you don't care to answer.

To get things started, I'm going to tap a few individuals -- people I've had only fairly brief contact with in the past. As indicated, however, anyone else who wants to respond is welcome to do so. The initial list:

Okay, that should do for now.

An afterthought on the first question -- the one about getting a chance to look things up in a library of the future: Keep in mind the cautionary example of Enoch Soames, the minor late-Victorian poet whose story Max Beerbohm tells. He sold his soul to the devil for a chance to spend an afternoon in the British Library, 100 years in the future, reading what historians and critics would eventually say about his work.

Soames ends up in hell a little early: The card catalog shows that posterity has ignored him even more thoroughly than his contemporaries did.

Proof, anyway, that ego surfing is really bad for you, even in the future. A word to the wise.

Author's email: 
scott.mclemee@insidehighered.com

Crossing Over

About 10 years ago, while reading a well-known medieval chronicle, I stumbled across an amazing crime story. The case involved a Norman knight, his beautiful young wife and the squire who allegedly raped her in 1386. 

The two men fought a celebrated judicial duel before the French king -- a fight to the death with lance, sword and dagger that also decided the lady’s fate. The affair was still controversial in France at the time I stumbled on the story, and many original documents survived, but no one had ever written a full-length account. Fascinated by the story, I started researching it and eventually began work on a book.

I also began talking with editors, literary agents, and even people connected to the film industry. At one point, I registered some material with the Writers Guild of America to protect my intellectual property. The book was represented briefly by a well-known Hollywood talent agency -- until the firm reorganized and my agent left, orphaning the project. Other literary agents read the proposal and sample chapters, only to turn the project down. Editors at highly respected trade houses read my material but politely rejected it, or hesitated indefinitely. An editor at a leading university press told me my book had "little commercial potential," while an editor at another top academic press read my proposal and offered me a contract right over the phone. Disappointed with the book’s commercial fortunes so far, I was nearly ready to accept the offer.

But around this time a very good literary agency took on the partly completed book, and within three days of putting it on the market they sold it at auction to a division of Random House. Foreign rights sales soon followed, and the deal notice in Publishers Weekly brought new film interest. The book was published last October, became a History Book Club selection, and was featured on NPR’s "Weekend Edition." After its January release in Britain, it was serialized on BBC Radio 4's  "Book of the Week." A BBC television documentary is now in the works.

Although I had published two previous books with university presses, this was my first venture into commercial publishing. Talking with editors and agents, as well as colleagues who had also “crossed over,” taught me a lot about trade publishing, which many scholars regard as the evil twin of the academic press. Some stereotypes are well-founded, as the cautionary tales below bear witness. Others are not, and academic authors can be in for pleasant surprises, as I also found.  Here is what I learned from my experience, arranged in the form of a Q&A.

Do you need an agent to publish with a trade press? No, not absolutely. Some trade presses do not accept "unagented" manuscripts, as a rule, and so it’s hard to get a foot in the door at those houses. But as I found from my own experience, some very good trade editors will look at unagented material, if you can get them interested in it with a good pitch letter (see below). But if you can interest an agent in your work, the agent will get an editor’s attention, probably with faster results and more money. That’s what agents do: use their knowledge, experience, and connections to market your work more effectively and with greater rewards than you could yourself.  

Which brings up a common misconception about agents: that they take your money and give little in return. First, legitimate literary agents never charge a fee up front but work on commission (usually 15 percent). Second, good literary agents always more than earn their commissions by getting you more money than you would ever get on your own. In fact, many agents pride themselves on securing advances that make up for their commissions several times over. If you begrudge agents their well-deserved 15 percent, perhaps trade publishing is not for you.

How do you find an agent? Before an agent can sell your book to a publisher, you have to sell your book to an agent. To do that, you have to figure out what kind of book you’ve got. This is harder than it sounds. Most authors consider themselves the world’s expert on their own book, but often they have only a hazy idea of what it is in marketing terms. As the author, you see your book from the inside, but agents (and editors) see it first from the outside. What kind of book is this? What other books is it like? How can I sell it? Will the public buy it? The better you know your book from the inside while seeing it from the outside, the better you’ll be at selling it to an agent. In pitching my commercial book, The Last Duel, to agents, I described it as “the true story of a notorious episode in fourteenth-century France -- a fatal triangle of crime, scandal, and revenge that will excite and fascinate readers with its larger-than-life characters, its air of mystery and intrigue, its many contemporary echoes, and the fact that ‘this really happened.’”

Once you’ve figured out what kind of book you’ve got, you’re ready to look for an agent. Referrals are one way, if you know an author willing to vouch for you and your project. Another way is by sending an unsolicited pitch letter "over the transom" to likely prospects. How do you find prospects? By figuring out who has sold other books like your own. (If you skipped the previous step, figuring out what your book is, go back.) Draw up a list of books like your own -- in topic, genre, approach -- and find out who sold them, and to whom. How do you get this kind of information? By reading the acknowledgments pages of books similar to yours. By attending book talks or writers’ conferences and picking up useful leads. By subscribing to trade magazines or industry Web sites, such as Publishers Weekly and Publishers Lunch, which report on the latest book deals.

I found my own agent not by referral but by reading up on the industry, studying the acknowledgments pages of other books, and sending an over-the-transom pitch. But when I’ve told colleagues who’ve asked my advice about breaking into the trade market that they should read Publishers Weekly and research the industry, they’ve rolled their eyes, as if to say, "You mean this actually involves some work?" If you can’t be bothered to find out about the business you want to join, don’t bother to join the business.

How do you sell your book to an agent?  You need to write a short, punchy pitch letter that brings an agent bolt upright in his or her chair as though a gorilla just charged into the room. That’s a good rule of thumb, anyway. The pitch sells the proposal, which in turn sells the book. To change metaphors, you might think of the pitch, proposal and book (either sample chapters or full manuscript) as the top, middle and bottom of a pyramid, respectively. Since each part must sell the next, each has to embody your best possible writing. A sloppy or half-baked pitch won’t garner a request to see your brilliant proposal. And a botched proposal won’t attract an ounce of interest in your masterpiece of a book. Write every sentence of each part as if your publishing contract depends on it. Because it does.

Will I have to change my writing style to do a trade book? Yes, you’ll have to lose the scholarly jargon and the tangles of theory, presenting the results of your research in clear, exciting prose that people actually want to read. Remember, with a trade book, unlike many academic titles, readers generally will buy your book only if they want to, not because they have to.

The most readable books usually have a narrative thread. They tell a story that draws the reader in with intriguing events and vivid characters. Often they belong to a familiar genre, or they combine multiple genres. For example, Natalie Zemon Davis’s The Return of Martin Guerre (an academic press title that had great crossover success and even became a film) is not just history but also a kind of detective story.

Whether your book is about a historical event, a scientific discovery, a famous person or a not-so-famous person who lived a remarkable life, you need to make the reader want to read the story that originally moved you to write the book. To do this, you have to learn everything you can about your subject, and then forget everything you know -- or at least the fact that you know it -- in order to tell it afresh for others.

A friend of mine -- a very successfully published scholar with a string of popular trade titles -- told me that a literary agent once told him that there’s an important difference between telling the reader what you know, and telling the reader that you know. If academic authors have a fault, it’s the almost irresistible urge to tell readers that we know things. But popular readers are generally a lot less interested in the fact that you know something, or how you know it, than in simply knowing what it is. In early drafts of my own book, I often wrote sentences that began, “According to one medieval chronicler....” My editor finally said, “Just tell us what happens and leave the citation for the notes.”

What if a trade publisher likes the book but asks for major changes of genre, approach, etc.? This is where figuring out your own book, well in advance, is very important. You need to know your book better than anyone else, including your agent, your editor, and the marketing people who eventually get involved in selling and promoting it. Otherwise you run the risk of losing control of your own project, and finding your name on something you wish you hadn’t written.

When The Last Duel was on the market, one publisher weighed in with a nice bid but also said they wanted me to change the title and “make it more of a romance.” Now, my book was a fact-based historical account of an alleged rape that resulted in a legal case and finally a trial by combat -- not a subject that could be turned into a “romance” and still maintain its integrity. I told my agent that the "romance" idea was unacceptable, and that I would not consider this offer, although at the time it was the high bid.

On the other hand, you also have to be able to recognize good advice when you get it -- from your agent, your editor, your spouse, colleagues or anyone else with whom you share your work prior to publication. For example, my editor suggested that I divide my original five-chapter book into 10 or even 12 more bite-sized chunks to make it more manageable for readers -- advice I’m glad I followed.

A future article will deal with the writing process, working with a trade editor, and creating a book with the structure and style that a popular audience will actually want to buy and read.

Author's email: 
info@insidehighered.com

Eric Jager is a professor of English at the University of California at Los Angeles, where he teaches medieval literature. He is the author of three books, most recently, The Last Duel: A True Story of Crime, Scandal, and Trial by Combat in Medieval France (Broadway, 2004).
