In its 1966 declaration on professional ethics, the American Association of University Professors, the professoriate's representative organization, states:
"Professors, guided by a deep conviction of the worth and dignity of the advancement of knowledge, recognize the special responsibilities placed upon them. ... They hold before them the best scholarly and ethical standards of their discipline. ... They acknowledge significant academic or scholarly assistance from [their students]."
Notwithstanding such pronouncements, higher education has recently provided the public with a series of ethical solecisms, most spectacularly the University of Colorado professor Ward Churchill's recidivistic plagiarism and duplicitous claim of Native American ancestry, along with his denunciations of 9/11 victims. While plagiarism and fraud presumably remain exceptional, accusations and complaints of such wrongdoing increasingly come to light.
Some examples include Demas v. Levitsky at Cornell, where a doctoral student filed a legal complaint against her adviser's failure to acknowledge her contribution to a grant proposal; Professor C. William Kauffman's complaint against the University of Michigan for submitting a grant proposal without acknowledging his authorship; and charges of plagiarism against Louis W. Roberts, the now-retired classics chair at the State University of New York at Albany. Additional plagiarism complaints have been made against Eugene M. Tobin, former president of Hamilton College, and Richard L. Judd, former president of Central Connecticut State University.
In his book Academic Ethics, Neil Hamilton observes that most doctoral programs fail to educate students about academic ethics, so that knowledge of it is eroding. The lack of emphasis on ethics in graduate programs breeds skepticism about the necessity of learning about ethics and about how to teach it. Moreover, nihilist philosophies that have gained currency within the academy itself, such as Stanley Fish's "antifoundationalism," contribute to the neglect of ethics education. For these reasons, academics generally do not seriously consider how ethics education might be creatively revived. In reaction to the Enron corporate scandal, for instance, some business schools have tacked an ethics course onto an otherwise ethically vacuous M.B.A. program. While a step in the right direction, a single course in a program otherwise uninformed by ethics will do little to change the program's culture, and may even engender cynicism among students.
Similarly, until recently, ethics education had been lacking throughout the American educational system. In response, ethicists such as Kevin Ryan and Karen Bohlin have advocated a radical renewal of ethics education in elementary schools. They claim that comprehensive ethics education can improve ethical standards. In Building Character in Schools, Ryan and Bohlin compare an elementary school to a polis, or Greek city state, and urge that ethics be fostered everywhere in the educational polis.
Teachers, they say, need to set standards and serve as ethical models for young students in a variety of ways and throughout the school. They find that manipulation and cheating tend to increase where academic achievement is prized but broader ethical values are not. They maintain that many aspects of school life, from the student cafeteria to the faculty lounge, ought to provide opportunities, among other things, to demonstrate concern for others. They also propose the use of vision statements that identify core virtues along with the implementation of this vision through appropriate involvement by staff and students.
We would argue that, like elementary schools, universities have an obligation to ethically nurture undergraduate and graduate students. Although the earliest years of life are most important for the formation of ethical habits, universities can influence ethics as well. Like the Greek polis, universities become ethical when they become communities of virtue that foster and demonstrate ethical excellence. Lack of commitment to teaching, lack of concern for student outcomes, false advertising about job opportunities open to graduates, and diploma-mill teaching practices are examples of institutional practices that corrode rather than nourish ethics on campuses.
Competency-based education, broadly considered, is increasingly of interest in business schools. Under the competency-based approach (advocated, for example, by Rick Boyatzis of Case Western Reserve University, David Whetten of Brigham Young University, and Kim Cameron of the University of Michigan), students are exposed not only to theoretical concepts, but also to specific competencies that apply the theory. They are expected to learn how to apply in their lives the competencies learned in the classroom, for instance those relating to communication and motivating others. Important ethical competencies (or virtues) should be included and fostered alongside such competencies. Indeed, in applied programs such as business, each discipline and subject can readily be linked to ethical virtues. Any applied field, from traffic engineering to finance, can and should include ethical competencies as an integral part of each course.
For example, one of us currently teaches a course on managerial skills, one portion of which focuses on stress management. The stress management portion includes a discussion of personal mission setting, which is interpreted as a form of stress management. The lecture emphasizes how ethics can intersect with practical, real-world decision making and how it can relate to competencies such as achievement orientation. In the context of this discussion, which is based on a perspective that originated with Aristotle, a tape is shown of Warren Buffett suggesting to M.B.A. students at the University of North Carolina that virtue is the most important element of personal success.
When giving this lecture, we have found that street-smart undergraduate business students at Brooklyn College and graduate students in the evening Langone program of the Stern School of Business of New York University respond well to Buffett's testimony, perhaps better than they would to Aristotle's timeless discussions in the Nicomachean Ethics.
Many academics will probably resist the integration of ethical competencies into their course curricula, and in recent years it has become fashionable to blame economists for such resistance. For example, in his book The Moral Dimension, Amitai Etzioni equates the neoclassical economic paradigm with disregard for ethics. Sumantra Ghoshal's article "Bad Management Theories Are Destroying Good Management Practices," in the Academy of Management Learning and Education, blames ethical decay on the compensation and management practices that evolved from economic theory's emphasis on incentives.
We disagree that economics has been all that influential. Instead, the problem is much more fundamental to the humanities and social sciences and has its root in philosophy. True, economics can exhibit nihilism. For example, the efficient markets hypothesis, which has influenced finance, holds that human knowledge is impotent in the face of efficient markets. This would imply that moral choice is impotent because all choice is so. But the efficient markets hypothesis is itself a reflection of a deeper and broader philosophical positivism that is now pandemic to the entire academy.
Over the past two centuries the assaults on the rational basis for morals have created an atmosphere that stymies interest in ethical education. In the 18th century, the philosopher David Hume wrote that one cannot derive an “ought” from an “is,” so that morals are emotional and cannot be proven true. Today’s academic luminaries have thoroughly imbibed this “emotivist” perspective. For example, Stanley Fish holds that even though academics do exhibit morality by condemning “cheating, academic fraud and plagiarism,” there is no universal morality beyond this kind of “local practice.”
Whatever its outcome, the debate over the rational derivability of ethical laws from a set of clear and certain axioms that hold universally is of little significance in and of itself. It will not determine whether ethics is more or less important in our lives; nor will it provide a disproof of relativism -- since defenders of relativism can still choose not to accept the validity of the derivation.
Yet ethics must still be lived -- even though the knowledge, competency, skill or talent that is needed to lead a moral life, a life of virtue, may not be derived from any clear and certain axioms. There is no need to derive, for instance, the value of good interpersonal skills. Rather, civilization depends on competency, skill and talent as much as it depends on practical ethics. Ethical virtue does not require, nor is it sustained by, logical derivation; it becomes most manifest, perhaps, through its absence, as revealed in the anomie and social decline that ensue from its abandonment. Philosophy is beside the point.
Based on much evidence of such a breakdown, ethics education experts such as Thomas Lickona of SUNY's College at Cortland have concluded that to learn to act ethically, human beings need to be exposed to living models of ethical emotion, intention and habit. Far removed from such living models, college students today are incessantly exposed to varying degrees of nihilism: anti-ethical or disembodied, hyper-rational positions that Professor Fish calls "poststructuralist" and "antifoundationalist." In contrast, there is scant emphasis in universities on ethical virtue as a prerequisite for participation in a civilized world. Academics tend to ignore this ethical prerequisite, preferring to pretend that doing so has no social repercussions.
They are disingenuous -- and wrong.
It is at the least counterintuitive to deny that the growing influence of nihilism within the academy is deeply, and causally, connected to increasing ethical breaches by academics (such as the cases of plagiarism and fraud that we cited earlier). Abstract theorizing about ethics has most assuredly affected academics’ professional behavior.
The academy's influence on behavior extends, of course, far beyond its walls, for students carry the habits they have learned into society at large. The Enron scandal, for instance, had more roots in the academy than many academics have realized or would care to acknowledge. Kenneth Lay, Enron's former chairman, holds a Ph.D. in economics from the University of Houston. Jeff Skilling, Enron's former CEO, is a Harvard M.B.A. who had been a partner at the McKinsey consulting firm, one of the chief employers of top-tier M.B.A. graduates. According to Malcolm Gladwell in The New Yorker, Enron had followed McKinsey's lead, habitually hiring the brightest M.B.A. graduates from leading business schools, most often from the Wharton School. Compared to most other firms, it had more aggressively placed these graduates in important decision-making posts. Thus, the crimes committed at Enron cannot be divorced from decision-making by the best and brightest of the newly minted M.B.A. graduates of the 1990s.
As we have seen, the 1966 AAUP statement implies the crucial importance of an ethical foundation to academic life. Yet ethics no longer occupies a central place in campus life, and universities are not always run ethically. With news of academic misdeeds (not to mention more spectacular academic scandals, such as the Churchill affair) continuing to unfold, the public rightly grows distrustful of universities.
It is time for the academy to heed the AAUP’s 1915 declaration, which warned that if the professoriate “should prove itself unwilling to purge its ranks of … the unworthy… it is certain that the task will be performed by others.”
Must universities learn the practical value of ethical virtue by having it imposed from without? Or is ethical revival possible from within?
Candace de Russy and Mitchell Langbert
Candace de Russy is a trustee of the State University of New York and a Hudson Institute Adjunct Fellow. Mitchell Langbert is associate professor of business at Brooklyn College of the City University of New York.
During the heyday of American economic and geographical expansion, in the late 19th century, the men who sold real estate occupied a distinct vocational niche. They were slightly less respectable than, say, riverboat gamblers -- but undoubtedly more so than pirates on the open seas. It was a good job for someone who didn’t mind leaving town quickly.
They created local realty boards and introduced licensing as means by which reputable practitioners could distinguish themselves from grifters. And in time, they were well enough organized to lobby the federal government on housing policy -- favoring developments that encouraged the building of single-family units, rather than public housing. Their efforts, as Hornstein writes, "would effectively create a broad new white middle class haven in the suburbs, while leaving behind the upper class and the poor in cities increasingly polarized by race and wealth."
I picked up A Nation of Realtors expecting a mixture of social history and Glengarry Glen Ross. It's actually something different: a contribution to understanding how certain aspects of middle-class identity took shape -- both among the men (and later, increasingly, women) who identified themselves as Realtors and among their customers. Particularly interesting is the chapter "Applied Realology," which recounts the early efforts of a handful of academics to create a field of study that would then (in turn) bolster the profession’s claims to legitimacy and rigor.
Hornstein recently answered a series of questions about his book -- a brief shift of his attention back to scholarly concerns, since he is now organizing director of Service Employees International Union, Local 36, in Philadelphia.
Q: Before getting to your book, let me ask about your move from historical research to union organizing. What's the story behind that?
A: I was applying to graduate school in my senior year of college and my advisor told me that while he was sure I could handle grad school, he saw me as more of "a politician than a political scientist." I had always been involved in organizing people and was a campus leader. But I also enjoyed academic work, and went on to get two graduate degrees, one in political science from Penn, another in history from the University of Maryland.
While I was doing the history Ph.D. at Maryland, a group of teaching assistants got together and realized that we were an exploited group that could benefit from a union. Helping to form an organizing committee, affiliating with a national union, getting to know hard-boiled organizers (many of whom were also intellectuals), and attempting to persuade my peers that they needed to take control of their own working conditions through collective action captured my imagination and interest much more than research, writing, or teaching.
After a long intellectual and personal journey, I finally defended my dissertation. The academic job market looked bleak, particularly as a graduate of a non-elite institution. And when I was honest with myself, I realized that my experience forming a graduate employee union engaged me far more than the intellectual work.
Armed with this insight, I put the diss in a box, and two weeks later, I was at the AFL-CIO's Organizing Institute getting my first taste of what it would be like to organize workers as a vocation. In the dark barroom in the basement of the George Meany Center for Labor Studies, a recruiter from an SEIU local in Ohio approached me and asked me if I'd like to spend the next few years of my life living in Red Roof Inns, trying to help low-wage workers improve their lives. Two weeks later, I landed in Columbus, Ohio, and was soon hooked.
And I would add this: The supply of talented and committed organizers is far outstripped by the demand. The labor movement's current crisis is, frankly, a huge opportunity for energetic and idealistic people to make a real difference. Hard work and commitment are really rewarded in the labor movement, and one can move quickly into positions of responsibility. It's very demanding and often frustrating work, but it's about as fulfilling a vocation as I could imagine.
Q: You discuss the emergence of realtors as the rise of a new kind of social identity, "the business professional." But I'm left wondering about early local real-estate boards. They sound kind of like lodges or fraternal groups, as much as anything else. In what sense are they comparable to today's professional organizations, as opposed to, say, the Elks or the Jaycees?
A: Indeed, early boards were very much like fraternal organizations. They were all male and clubby, there was often a "board home" that offered a retreat space, and so on. Early real estate board newsletters are rife with the sorts of jokes about women and minorities that were standard fare in the 1910s and 1920s -- jokes that, I argue, help to police the boundaries of masculinity.
In the early chapters of the book, I provide brief sketches of the workings of the Chicago and Philadelphia real estate boards, as well as a sort of anthropological view of early real estate conventions. My favorite was the 1915 Los Angeles convention, during which the main social event was a drag party. In my view, the conventions, the board meetings, the social events, the publications, all formed a homosocial space in which a particular sort of masculinity was performed, where the conventions of middle-class masculinity were established and reinforced.
In the early 1920s, the emphasis began to shift from fraternalism to a more technocratic, professional modality. Herbert Nelson took the helm at the National Association of Real Estate Boards in 1923, and he started to make NAREB look much more like a modern professional organization. In some respects he created the mold. He made long-term strategic plans, asserted the necessity for a permanent Realtor presence in Washington, D.C., pushed for standards for licensing, worked with Herbert Hoover's Commerce Department to promulgate a standard zoning act, and linked up with Professor Richard T. Ely [of the University of Wisconsin at Madison] to help "scientize" the field.
Nelson served as executive director of NAREB for over 30 years. During his tenure, the organization grew, differentiated, specialized, and became a powerful national political actor. In sum, it became a true modern professional association in most ways. Yet like most other professional organizations prior to the ascendancy of feminism and the major incursion of women into the professions, masculine clubbiness remained an important element in the organizational culture well into the 1970s.
In sum, the story I tell about the complex interdependencies of class, gender, and work identities is largely about the Realtors’ attempts to transform an Elks-like organization into a modern, "professional" business association.
Q: On the one hand, they see what they are doing as a kind of applied social science -- also creating, as you put it, "a professional metanarrative." On the other hand, you note that Ely's Institute for Research in Land Economics was a casualty of the end of the real estate bubble. Doesn't that justify some cynicism about realtors' quest for academic legitimacy?
A: I don’t see the Realtors or the social scientists like Ely in cynical terms at all. In fact, both parties are quite earnest about what they’re doing, in my view. Ely was nothing if not a true believer in the socially transformative power of his research and of social scientific research in general. He managed to persuade a faction of influential Realtors, primarily large-scale developers ("community-builders") such as J.C. Nichols, that research was the key to professionalism, prosperity, and high-quality real estate development. Ely’s Institute was not a casualty of the implosion of the 1926 Florida real estate bubble as such. But the real estate collapse and the ensuing Depression made it much harder for the Realtors to make claims to authority based on disinterested science.
It's not that the grounding of the whole field of Land Economics was problematic -- at least no more so than any other field of social or human science, particularly one that produces knowledge that can be used for commercial purposes.
The academic field was in its infancy in the 1910s and 1920s, and there were intra-disciplinary squabbles between the older, more historical economists like Ely and the younger generation, which was much more model- and mathematics-driven. At the same time, there were sharp divisions among Realtors between those who believed that professionalism required science (and licensing, and zoning, and so on) and those who rejected this idea.
So, yes, the Elyian attempt at organizing the real estate industry on a purely "scientific" basis, operating primarily in the interest of the social good, was largely a failure. However, the 1920s mark a watershed in that the National Association became a major producer and consumer of social scientific knowledge. Business schools began to offer real estate as a course of study. Textbooks, replete with charts and graphs and economic equations, proliferated. Prominent academics threw their lot in with the Realtors.
In the end, the industry established its own think tank, the Urban Land Institute, the motto of which is "Under All, The Land" -- taken straight from Ely's work. But the profession itself remained divided over the value of "science" -- the community-builders generally supported efforts to scientize the field, while those on the more speculative end of the profession were generally opposed.
But again, I don’t think that the grounding of the field of land economics is any more questionable than any other subfield of economics, such as finance or accounting.
Q: Your book left me with a sort of chicken-and-egg question. You connect the growth of the profession with certain cultural norms -- the tendency to define oneself as middle-class, the expectation of private home ownership, etc. Didn't those aspirations have really deep roots in American culture, which the Realtors simply appealed to as part of their own legitimization? Or were they more the result of lobbying, advertising, and other activities of the real-estate profession?
A: Absolutely, these tendencies have roots deep in American culture. The term "middle class" was not really used until the late 19th century -- "middling sorts" was the more prevalent term before then. The "classless society" has long been a trope in American culture, the idea that with hard work, perseverance, and a little luck, anyone can "make it" in America, that the boundaries between social positions are fluid, etc.
But it’s not until the early-to-mid 20th century that homeownership and middle-class identity come to be conflated. The "American Dream" is redefined from being about political freedom to being about homeownership. At around the same time, debt is redefined as "credit" and "equity."
So, yes, I'd agree to some extent that the Realtors tapped into longstanding cultural norms as part of their efforts at self-legitimization. Like most successful political actors, they harnessed cultural commonsense for their own ends -- namely, to make homeownership integral to middle-class identity. Their political work enabled them, in the midst of the Depression, to get the National Housing Act passed as they wrote it -- with provisions that greatly privileged just the sort of single-family, suburban homes leading members of NAREB were intent on building.
The Realtors used the cultural material at hand to make their interests seem to be the interests of the whole society. But, as we know from many fine studies of suburban development, many people and many competing visions of the American landscape were marginalized in the process.
Bait and Switch: The (Futile) Pursuit of the American Dream, published this week by Metropolitan Books, is a return to matters that Barbara Ehrenreich has written about in the past. And no, I don't just mean the world of economic hard knocks.
In obvious ways, the new book's narrative of trying to get a white-collar corporate job (say, as a public-relations person) is similar in method and tone to Nickel and Dimed (2001), her account of the lives of the working poor. Both are works of first-person reporting a la George Orwell's The Road to Wigan Pier -- treading the fine line between investigative journalism and participant-observer ethnography, with the occasional dash of satire thrown in.
But Ehrenreich's new book also revisits a world first explored in her early work on "the professional-managerial class" (often abbreviated as PMC). In papers written during the late 1970s with her first husband, John Ehrenreich, she worked out an exacting Marxist analysis of the PMC as "consisting of salaried mental workers who do not own the means of production" (hence aren't capitalists) but whose "major function in the social division of labor may be broadly described as the reproduction of capitalist culture and capitalist relations." Ehrenreich revisited the topic, in a more popular vein, with Fear of Falling: The Inner Life of the Middle Class (1989).
You don't hear any trace of sociological diction in Ehrenreich's latest book, in which she goes undercover as "Barbara Alexander," a homemaker with some work experience in writing and event-planning. (Alexander's resume is a more modest rewriting of Ehrenreich's own background as academic and journalist.) Her search for a new job puts her in competition with other casualties of downsizing and midlife unemployment. She spends her time reading Monster.com, not Louis Althusser.
But some of Ehrenreich's old theoretical concerns do pop up as she tries to land a gig on the lower rungs of the PMC hierarchy. More than a quarter century ago, she had written that the private life of the middle class "becomes too arduous to be lived in private: the inner life of the PMC must be continuously shaped, updated and revised by ... ever mounting numbers of experts." And so Barbara Alexander finds teams of "career consultants" ready to help her adjust her outlook to fit into the new corporate culture. How? Through the modern science of psychobabble.
After reviewing Bait and Switch for Newsday, I still had some questions about where the book fit into Ehrenreich's thinking. Happily, she was willing to answer them by e-mail.
Q: Nickel and Dimed has become a standard reading assignment for undergraduates over the past few years, and some of that audience must now be entering the white-collar job market you describe in Bait and Switch. Is there anything in the new book intended as guidance for readers who will be facing that reality?
A: I'd like to reach undergraduates with Bait and Switch before they decide on a business career. I'm haunted by the kid I met at Siena College, in N.Y., who told me he was really interested in psychology, but since that isn't "practical," he was going into marketing, which draws on psychology -- though, as this fellow sadly admitted, only for the purpose of manipulating people. Or the gal I met at the University of Oregon who wants to be a journalist but is drifting toward PR so she can make a living.
Right now, business is the most popular undergraduate major in America, largely because young people believe it will lead to wealth or at least security. I want them to rethink that decision, or at least do some hard thinking about what uses they would like to apply their business skills to.
There's not much by way of individual guidance in Bait and Switch, but I do want to get people thinking more about corporate domination, not only of the economy, but of our psyches. Generally speaking, the corporations have us by the short hairs wherever you look, and of course, one source of their grip is the idea that they are the only or the major source of jobs. I'm asking, what kind of jobs -- back-breaking low-wage jobs as in Nickel and Dimed, or transient, better-paid jobs that seem to depend heavily on one's ability to be a suck-up, as in Bait and Switch?
Q: The pages in Bait and Switch devoted to New Age-inflected business-speak are quite funny -- but in an angry way. How much do you think people really buy into this ideology? Do they take it seriously? Or is it just something you have to repeat, to be part of the tribe?
A: Well, someone must believe it, or there wouldn't be any market for all the business advice books spewed out by career coaches and management gurus. I had the impression that the job seekers I was mingling with usually thought they should believe it all, or at least should act as if they believe it all. There certainly seems to be a lot of fear of being different or standing out in any way.
Q: What's the relationship between the world you are describing in the new book and that of the professional-managerial class? Are business professionals fully fledged members of the PMC? Or are they clueless and self-deluding mimics of it? All of the above?
A: Sure, they're bona fide members of the PMC as John Ehrenreich and I defined it in the 70s; they are college-educated and they command others or at least determine the work that others will do. But your question makes me think that an update on the PMC is long overdue.
In the late 80s, when I wrote Fear of Falling, it looked like the part of the PMC employed as corporate operatives was doing pretty well compared to the more academic and intellectual end of the PMC, which was beginning to get battered by HMOs (in the case of physicians), budget cuts (in the case of college professors, social workers, and others), etc.
Starting in the late 80s, though -- and insufficiently noted by me at the time -- the corporate operative-types began to lose whatever purchase they had on stability. First there were the mergers and acquisitions of the 80s, which inevitably led to white collar job loss; then there was the downsizing of the 90s; and now of course the outsourcing of many business-professional functions. So no one is safe.
Q: Do people in this sphere have any way to win a degree of real control over their economic condition? If they don't have some regulation of the market for their labor via certification (i.e. real professionalization) and they find it unimaginable to be unionized, does that leave them any options?
A: No. As a blue collar union friend of mine commented: They bought the line, they never had any concept of solidarity, and now they're sunk.
Q: In reporting this book, you created an alter ego, "Barbara Alexander," who is not the same person as Barbara Ehrenreich. But she's not totally different, either. There is a degree of overlap in age, background, work experience, etc. The job search proves fairly humiliating for Barbara Alexander. Was it hard to keep some distance from the role? It felt like she might explode a few times.
A: Remember, "Barbara Alexander" was just my cover; I only distanced myself enough to be a fairly low-key observer/reporter. Hence no tantrums or crazed rants. So yes, a certain amount of self-control was necessary, and it did take its toll. I often felt extremely soiled, compromised and generally yucky about the whole venture.
By which I don't mean I'm too pure to be involved in the great corporate money-making machine (my books, after all, are published by a large corporation and I happily accept my royalties) but that I was trying to act like someone I'm not and that I suspect very few people are, i.e., the endlessly upbeat, compliant, do-with-me-what-you-will corporate employee.
Q: Some aspects of the labor market you describe in Bait and Switch sound comparable to trends emerging in parts of academe. Any thoughts on that score? Have you considered writing, say, Ivy and Adjunct?
A: You want me to go undercover as an adjunct? No way. First, I've been an adjunct, years ago, at both NYU and the College of New Rochelle, and I understand the pay hasn't improved since then. So sorry, that option is no more enticing than another stint at Wal-Mart.
Someone should write about it though. The condition of adjuncts, who provide the bulk of higher ed in this country, is an absolute scandal. I've met adjuncts who moonlight as maids and waitresses, and I've read about homeless ones. If the right is so worried about the academy being too left wing, they should do something about the treatment of adjuncts (and many junior faculty). There's something about hunger that has a way of turning people to the left.
The wedding announcements in The New York Times are, as all amateur sociologists know, a valuable source of raw data concerning prestige-display behavior among the American elite. But they do not provide the best index of any individual’s social status. Much more reliable in that respect are the obituaries, which provide an estimate of the deceased party’s total accumulated social capital. They may also venture a guess, between the lines, about posterity’s likely verdict on the person.
In the case of John Kenneth Galbraith, who died last week, the Times obituary could scarcely fail to register the man’s prominence. He was an economist, diplomat, Harvard professor, and advisor to JFK. Royalties on his book The Affluent Society (1958) guaranteed that -- as a joke of the day had it -- he was a full member. But the notice also made a point of emphasizing that his reputation was in decline. Venturing with uncertain steps into a characterization of his economic thought, the obituary treated Galbraith as a kind of fossil from some distant era, back when Keynesian liberals still roamed the earth.
He was patrician in manner, but an acid-tongued critic of what he once called "the sophisticated and derivative world of the Eastern seaboard." He was convinced that for a society to be not merely affluent but livable (an important distinction now all but lost) it had to put more political and economic power in the hands of people who exercised very little of it. It was always fascinating to watch him debate William F. Buckley -- encounters too suave to call blood sport, but certainly among the memorable moments on public television during the pre-"Yanni at the Acropolis" era. He called Buckley the ideal debating partner: “pleasant, quick in response, invulnerable to insult, and invariably wrong.”
Galbraith’s influence was once strong enough to inspire Congressional hearings to discuss the implications of his book The New Industrial State (1967). Clearly that stature has waned. But Paul Samuelson was on to something when he wrote, “Ken Galbraith, like Thorstein Veblen, will be remembered and read when most of us Nobel Laureates will be buried in footnotes down in dusty library stacks.”
The reference to the author of The Theory of the Leisure Class is very apropos, for a number of reasons. Veblen’s economic thought left a deep mark on Galbraith. That topic has been explored at length by experts, and I dare not bluff it here. But the affinity between them went deeper than the conceptual. Both men grew up in rural areas among ethnic groups that never felt the slightest inferiority vis-a-vis the local establishment. Veblen was a second-generation Norwegian immigrant in Wisconsin. Galbraith, whose family settled in a small town in Canada, absorbed the Scotch principle that it was misplaced politeness not to let a fool know what you thought of him. “Better that he be aware of his reputation,” as Galbraith later wrote, “for this would encourage reticence, which goes well with stupidity.”
Like Veblen, he had a knack for translating satirical intuitions into social-scientific form. But Galbraith also worked the other way around. He could parody the research done by “the best and the brightest,” writing sardonically about what was really at stake in their work.
I’m thinking, in particular, of The McLandress Dimension (1963), a volume that has not received its due. The Times calls it a novel, which only proves that neither of the two obituary writers had read the book. And it gets just two mentions, in passing, in Richard Parker’s otherwise exhaustive biography John Kenneth Galbraith: His Life, His Politics, His Economics (Farrar, Straus, and Giroux, 2005).
While by no means a major work, The McLandress Dimension deserves better than that. Besides retrieving the book from obscurity, I’ll take a quick look at a strange episode in its afterlife.
The McLandress Dimension, a short collection of articles attributed to one “Mark Epernay,” was published by Houghton Mifflin during the late fall of 1963. At the time, Galbraith was the U.S. ambassador to India. Portions of the book had already appeared in Esquire and Harper’s. One reviewer, who was clearly in on the joke, introduced Mark Epernay as “a gifted young journalist who has specialized in the popularization -- one might almost say the vulgarization -- of what one has learned to call the behavioral sciences.”
The pen name combined an allusion to Mark Twain with a reference to a town in France that Galbraith had come across in a book about the Franco-Prussian war. (Either that, or on the side of a wine crate; he was not consistent on this point.) “The pseudonym was necessary because I was then an ambassador,” recalled Galbraith in a memoir, “and the State Department required its people to submit their writing for review while forbidding them to take compensation for it.... However, it did not seem that this rule need apply to anything written in true anonymity under a false name. Accordingly, I wrote to the then Attorney General, Mr. Robert Kennedy, proposing that I forego the clearance and asking if I might keep the money. So difficult was the question or so grave the precedent that my letter was never answered.”
But Epernay was just the foil for Galbraith’s real alter ego -- the famous Herschel McLandress, the former professor of psychiatric measurement at the Harvard Medical School and chief consultant to the Noonan Psychiatric Clinic in Boston. The researcher was a frequent recipient of grants from the Ford Foundation, the Rockefeller Foundation, and sundry other nonprofit geysers of soft money. His ideas were the subject, as Epernay put it, “of some of the most trenchant debates in recent years at the Christmas meetings of the American Association for Psychometrics.” While his name was not yet a household word, McLandress had an impressive (if top-secret) list of clients among prominent Americans.
The work that defined his career was his discovery of “the McLandress Coefficient” – a unit of measurement defined, in laymen’s terms, as “the arithmetic mean or average of intervals of time during which a subject’s thoughts centered on some substantive phenomenon other than his own personality.”
The exact means of calculating the “McL-C,” as it was abbreviated, involved psychometric techniques rather too arcane for a reporter to discuss. But a rough estimate could be made based on how long any given person talked without using the first-person singular pronoun. This could be determined “by means of a recording stopwatch carried unobtrusively in the researcher’s jacket pocket.”
A low coefficient -- anything under, say, one minute -- “implies a close and diligent concern by the individual for matters pertaining to his own personality.” Not surprisingly, people in show business tended to fall into this range.
Writers had a somewhat higher score, though not by a lot. Epernay noted that Gore Vidal had a rating of 12.5 minutes. Writing in The New York Review of Books, Vidal responded, “I find this ... one finds this odd.”
What drew the most attention were the coefficients for various political figures. Nikita Khrushchev had the same coefficient as Elizabeth Taylor – three minutes. Martin Luther King clocked in at four hours. Charles de Gaulle was found to have the very impressive rating of 7 hours, 30 minutes. (Further studies revealed this figure to be somewhat misleading, because the general did not make any distinction between France and himself.) At the other extreme was Richard Nixon, whose thoughts were never directed beyond himself for more than three seconds.
Epernay enjoyed his role as Boswell to the great psychometrician. Later articles discussed the other areas of McLandress’s research. He worked out an exact formula for calculating the Maximum Prestige Horizon of people in different professions. He developed the “third-dimensional departure” for acknowledging the merits of both sides in any controversial topic while carefully avoiding any form of extremism. (This had been mastered, noted Epernay, by “the more scholarly Democrats.”)
And McLandress reduced the size of the State Department by creating a fully automated foreign policy -- using computers to extrapolate the appropriate response to any new situation, based on established precedent. “Few things more clearly mark the amateur in diplomacy,” the reporter explained, “than his inability to see that even the change from the wrong policy to the right policy involves the admission of previous error and hence is damaging to national prestige.”
One piece in the book covered the life and work of someone who has played a considerable role in the development of the modern Republican Party, though neither Galbraith nor Epernay could have known that at the time.
The figure in question was Allston C. Wheat, “one of the best tennis players ever graduated from Cornell” as well as a very successful “wholesaler of ethical drugs, antibiotics, and rubber sundries in Philadelphia.” Upon retirement, Wheat threw himself into the writings of Ludwig von Mises, Ayn Rand, and Barry Goldwater, among others. His studies left Wheat sorely concerned about the menace of creeping socialism in America. As well he might be. Certain developments in the American educational system particularly raised his ire. Wheat raised the alarm against that insidious subversive indoctrination in collectivist ideology known as “team sports.”
“Every healthy able-bodied young American is encouraged to participate in organized athletic events,” Wheat noted in a widely-circulated pamphlet. This was the first step in brainwashing them. For an emphasis on “team spirit” undermines good, old-fashioned, dog-eat-dog American individualism. “The team,” he warned, “is the social group which always comes first.... If you are looking for the real advance guard for modern Communism, you should go to the field-houses and the football stadiums.”
The tendency of the Kennedys to play touch football at family gatherings proved that “they are collectivist to the core.” And then there was the clincher: “Liberals have never liked golf.”
Wheat’s dark suspicions had a solid historical basis. “In 1867,” Epernay pointed out in a footnote, “the first rules for college football were drawn up in Princeton, New Jersey. That was the year of the publication of Das Kapital.... Basketball was invented in 1891 and the Socialist Labor Party ran its first candidate for President in the following year.” Coincidence? Don’t be gullible. As the saying has it, there’s no “I” in “team.”
The goal of Wheat’s movement, the Campaign for Athletic Individualism, was to ensure that young people’s McLandress Coefficients were low enough to keep America free. Today, Wheat has been forgotten. No doubt about it, however: His legacy grows.
The McLandress Dimension was in many ways a product of its moment -- that is, Camelot, the early 60s, a time of heavy traffic on the wonky crossroads where social science and public policy meet.
Books like Vance Packard’s The Status Seekers were showing that the American social hierarchy, while in transition, was very much in place. A celebrity culture in the arts, politics, and academe was emerging to rival the one based in Hollywood. The sort of left-liberal who read Galbraith with approval could assume that the McCarthyist worldview belonged in the dustbin of history.
The McLandress Dimension satirized all these things -- but in a genial way. It said, in effect: “Let’s not be too serious about these things. That would be stupid.”
So Galbraith’s timing was good. But it was also, in a way, terrible. Articles about the book started appearing in early December -- meaning they had been written at least a few weeks earlier, before the assassination of the president. There was a lightheartedness that must have been jarring. Most of the reviewers played along with the gag. One magazine sent a telegram to the embassy in India, asking Galbraith, “Are you Mark Epernay?” He cabled back, ”Who’s Mark Epernay?”
But the season for that kind of high spirits was over. If Herschel McLandress was the embodiment of the number-crunching technocratic mentality in 1963, his place in the public eye was soon taken by Robert McNamara. Such “extremists in defense of liberty” as Allston Wheat were trounced during the 1964 presidential campaign -- only to emerge from it stronger and more determined than ever. Galbraith’s serious writings were a major influence on the Great Society programs of the Johnson administration. But that consummation was also, in hindsight, a swan song.
As for The McLandress Dimension itself, the writings of Mark Epernay found a place in the bibliographies of books on Galbraith. But they were ignored even by people writing on the development of his thought. I recently did a search to find out if anyone ever cited the work of Herschel McLandress in a scholarly article, perhaps as an inside joke. Alas, no. All that turns up in JSTOR, for example, is a brief mention of Galbraith’s book in an analysis of the humorous literature on Richard Nixon. (There is, incidentally, rather a lot of it.)
And yet the story does not quite end there.
In 1967, the Dial Press issued Report from Iron Mountain: On the Possibility and Desirability of Peace, which the publisher claimed was in fact a secret government document. The topic was the socio-economic implications of global peace. It was prepared, according to the introduction, by a group of prominent but unnamed social scientists. The prose was leaden, full of the jargon and gaseous syntax of think-tank documents.
The challenge facing the Iron Mountain group, it seemed, was to explore any adverse side-effects of dismantling the warfare state. The difficulties were enormous. Military expenditures were basic to the economy, they noted. Threat from an external enemy fostered social cohesion. And the Army was, after all, a good place for potentially violent young men.
It would be necessary to find a way to preserve all the useful aspects of war preparation, and to contain all the problems it helped solve. A considerable amount of social restructuring would be required should the Cold War end. The think tank proposed various options that leaders might want to keep in mind. It could prove necessary to sponsor new forms of extremely violent entertainment, introduce slavery, and concoct a plausible story about the threat of extraterrestrial invasion.
This was, of course, a satire on the “crackpot realism” (as C. Wright Mills once termed it) of the RAND Corporation and the like. It was concocted by Leonard Lewin, a humor writer, and Victor Navasky, the editor of The Nation. But the parody was so good as to be almost seamless. It proposed the most extreme ideas in an incredibly plodding fashion. And the scenarios were only marginally more deranged-sounding than anything mooted by Herman Kahn, the strategist of winnable thermonuclear war.
Serious journals devoted articles to debating the authenticity of the document. One prominent sociologist wrote a long article suggesting that it was so close to the real thing that one might as well take it seriously. At one point, people in the White House were reportedly making inquiries to determine whether Report from Iron Mountain might not be the real thing.
In the midst of all this, Herschel McLandress, who had retreated into silence for almost four years, suddenly returned to public life. In an article appearing in The Washington Post, the great psychometrician confirmed that Report from Iron Mountain was exactly what it claimed to be. He had been part of the working group involved in the initial brainstorming. He chided whoever was responsible for leaking the document. By no means were Americans ready to face the horrors of peace. He did not challenge any of the report’s conclusions. “My reservations,” McLandress stated, “relate only to the wisdom of releasing it to an obviously unconditioned public.”
Writing from behind his persona, Galbraith turned in a credible impression of social-science punditry at its most pompous. It must have been very funny if you knew what was going on. And presumably some people did remember that McLandress was himself a figment of the imagination.
But not everyone did. Over time, Report from Iron Mountain became required reading for conspiracy theorists -- who, by the 1990s, were quite sure it was a blueprint for the New World Order. After all, hadn’t a reviewer vouched for its authenticity in The Washington Post?
And what did Galbraith think of all this? I have to.... One has to wonder.
Last fall in the section I teach of introductory microeconomics, I asked a student a simple question about the demand and supply of gutters. Nora had a blank expression, one that said, “I haven’t a clue what you’re talking about.” If Nora had been struggling to understand economics, I wouldn’t have thought a thing about it. But Nora is a star, one who shines brightest when asked really tough questions.
Then it occurred to me. Nora didn’t know what the word “gutter” meant. It is easy to forget that Nora is Bulgarian -- her English is that good. I asked her whether she knew what the word meant, and looking embarrassed, she replied that she didn’t. How do you explain what gutters are without using the word gutter? It’s not easy, at least not for me. So, I broke into pantomime, with my fingers simulating raindrops heading for a cliff where they were caught by an invisible gutter.
Suddenly, her face lit up, and she quickly answered my original question. But it had taken her longer than I would have expected, even adjusting for my pantomiming skills. Still puzzled, I asked her, “How do you say ‘gutter’ in Bulgarian?” She said she didn’t know. Amazed, I said, “You’re pulling my leg, right?” She wasn’t.
Are there gutters in Bulgaria? I don’t know; I’ve never been there. Everywhere I’ve lived, gutters are ubiquitous. Are they common elsewhere, or are they just an American thing?
One student disliked my treatment of Nora, saying on her evaluation of the class: "Something that bothered not only me but other students (and I know this from talking to my classmates) was the way Professor Harrington picked on the international students. We had about five international students in the class, and one day Professor Harrington did a problem about gutters. The student he asked to answer the question was Bulgarian and did not know what the word 'gutter' meant, and Professor Harrington made a big deal out of this. He asked her how you would say 'gutter' in Bulgarian."
She says, “He continued to [quiz international students about their understanding of English] in other classes, singling out the international students and making them look inferior to the rest of the class.”
If the student had listened to the quality of her international classmates’ answers to my questions, she would have realized that they were academically superior to the vast majority of their classmates. Indeed, their median grade was 4.0; they all spoke English fluently; and their essays had fewer grammatical errors than those of most of their classmates. It seems implausible to me that any rational observer would infer that they were inferior based on my questions about their knowledge of a few English words.
But even Nora looked embarrassed when she “confessed” that she didn’t know what gutters were. She had no reason to be embarrassed, yet she was. Why?
Perhaps, it has to do with the power of gut feelings, which allow people to quickly categorize experiences without having to think too deeply about them. Following them can even save your life in situations where you need to make quick decisions, implying that gut feelings are probably hard-wired into us via evolution. Hence, gut feelings probably can’t easily be turned off, implying that Nora could have been embarrassed by the gutters episode regardless of whether it was justified. And this is a shame -- because good class interactions should be full of professors and students going in any number of directions, some of them uncomfortable, without worrying about appearances or comfort levels (or whether some comment is going to make you a poster child for the Academic Bill of Rights).
I was in a gray area with Nora, one that I did not perceive as being gray until I thought about the comments of this student. I feel bad that I might have embarrassed Nora -- it was certainly not my intention. Nevertheless, asking Nora whether she knew the word for gutter in Bulgarian was the highlight of the course for me. My intuition screamed at me to ask it and her answer rewarded the impulse -- not because I was happy to discover that she didn’t know the word, but because it made me think more deeply about the way in which languages compete with one another for survival. Indeed, many languages face extinction because they are cluttered with words that people no longer find useful. For example, some languages have dozens and dozens of different words for ice, which may not be a selling point in the coming age of global warming.
Nobel laureate Robert Solow argues that the most difficult thing to teach students is how to be creative in economics, followed closely by critical judgment. It is much easier to teach tools, such as demand and supply, than how to use them creatively, or critically. The first step in using economics creatively is to ask interesting questions, ones that naturally arise during genuine conversations sparked by observing differences like those concerning the acquisition of language. While these conversations are crucial in teaching students to be creative, they are also likely to tumble into gray areas and sometimes produce dry holes, two things that make some students uncomfortable.
Another way to be creative in economics is to apply economic reasoning to topics commonly thought to lie outside the realm of economics. Hence, I want my students to learn that there are no boundaries to the usefulness of economic reasoning. I mean NO boundaries, absolutely none. Boundaries smother creativity because they encourage students to turn off their economic reasoning skills whenever they cross them.
Last semester, I described how a San Diego abortion cartel in the late 1940s charged women different prices depending on the quality of their clothing and the characteristics of the person accompanying them, a practice that economists call price discrimination. For example, a young woman who was brought to the clinic by an unrelated, well-dressed Sacramento businessman was charged $2,600 for an abortion. If the woman had come alone, she would have paid something closer to $200. Four students have come to my office or e-mailed me with concerns over the use of examples like this one. For example, one student argued that abortion is too morally charged to be used as fodder for examples, especially ones that are so narrowly drawn.
Crossing the border into conversations about race is especially dangerous, because the border is patrolled by guards searching for insensitive comments. It takes courage and tolerance on the part of both students and professors to have genuine conversations about race. However, no topic is more important to discuss in economics courses given the glaring disparities in economic outcomes between African-Americans and whites. For another course I teach, students are required to read an article about the controversy that erupted when members of one middle-class community proposed naming a “nice street” after Martin Luther King Jr. The proponents wanted to weaken the correlation of his name with poverty and crime, while the opponents feared that naming a street after him would cause their neighborhood to decay. I admire the proposal yet empathize with the opponents. Since streets bearing his name are more commonly found in poor neighborhoods, (even unprejudiced) people might rationally "steer clear" of the area if they name a street after Martin Luther King Jr., a phenomenon economists call statistical discrimination.
Teaching students to use economics creatively requires having conversations that are not smothered by fears of saying something wrong or of stepping over some boundary beyond which economic reasoning is prohibited. But genuine conversations require that students have done enough of the reading to participate with intelligence -- and checking on that may also make students uncomfortable.
A student last fall accused me in his or her course evaluation of picking on students, saying that “if it was obvious a student was unprepared or had not done the assigned reading [Professor Harrington] would call them out on it.” It’s true. I admit it. Failing to read the assigned articles imposes spillover costs on other students that can be corrected by imposing penalties on unprepared students. For example, one student could not answer straightforward questions about the readings in two consecutive classes, prompting me to ask him whether he had ever heard of the expression, “three strikes and you’re out.” At the beginning of the third class, he joined the conversation, easily answering my initial questions and making a few comments of his own.
David E. Harrington
David E. Harrington is the Himmelright associate professor of economics at Kenyon College.
A voice overhead in the Washington, D.C., metro system warns “customers” not to try to hold open the doors of a subway car as they are closing. The announcement is made every stop or two. You hear it at least a couple of times during each trip. Yet I am always taken aback. A spirit of usage crankiness kicks in -- my inner Edwin Newman -- to insist on the difference between being a passenger and a customer. The words aren’t mutually exclusive, of course, but why not use the one that applies to the concrete, particular circumstance of being in the mass transit system?
Not that complaining would do any good. The language will get mangled, irregardless. Besides, this is a case of usage reflecting an established, nearly ubiquitous attitude. Everything is a market, and everyone is now (in all ways and at all times) a consumer. Someone who pays taxes for public transportation is not so much a citizen as a customer, in exactly the same sense as folks in line at McDonald’s. Likewise with the student in a university classroom -- who, having paid good money, expects both a passing grade and a certain level of entertainment, and may not be shy about expressing these demands.
Perhaps it’s a cultural residue of the past few decades of “market utopianism,” to borrow an expression used by Lawrence D. Brown and Lawrence R. Jacobs in The Private Abuse of the Public Interest: Market Myths and Policy Muddles, just published by the University of Chicago Press. The authors are serious wonks (Brown is professor of health care policy and management at Columbia University, while Jacobs directs the Center for the Study of Politics and Government at the University of Minnesota) rather than testy guys muttering about the Zeitgeist while riding the subway. But their book, which is compact and jargon-free, is intended for ordinary readers trying to understand the limitations of free-market fundamentalism – including its clear tendency to backfire.
The book's timing is remarkable. At this point, not even Business Week is completely faithful to the doctrine that “markets are smart, government is dumb,” as former Republican House Majority Leader and onetime econ prof Richard Armey once put it. A recent cover of the magazine announced: “Washington’s new role in banking is just the beginning of the ‘public-private’ world to come.” The phrase “public-private” is printed in red, which one might interpret in a couple of ways -- either as a sign of creeping socialism, or because the global economy is swimming in that color of ink.
Dogmatic advocates of “the magic of market forces” suffer, according to Brown and Jacobs, from not having understood Adam Smith very well. “Far from offering an unqualified celebration of unrestrained self-aggrandizement,” they write, “Smith struggled to balance individual self-interest against the social need for institutions that harnessed self-regard to the service of society.” The discipline of the market is not enough to achieve that balance. The state must provide certain public goods. Market forces alone aren’t sufficient to meet the common need for national defense, rule of law, public education, and the maintenance of infrastructure for transportation.
Such services are “for the benefit of the whole society,” according to Adam Smith, and must be “defrayed by the general contribution of the whole society.” Which means taxes. (It seems a matter of time before Sarah Palin releases an attack ad regarding all the crypto-Marxism in The Wealth of Nations.)
Brown and Jacobs laud what they call Smith’s “pragmatic realism,” including his recognition of the need for state regulation of banking. But somewhere along the way, Smith’s notion of a balance between the play of private interest and the role of the state was turned into a zero-sum game -- a fervent antistatism, for which market competition was the ideal prescription for what ails us. For just about all our problems -- according to the “marketist” doctrine, anyway -- come from government programs or regulations.
Minimizing the role of the state clears the way for such market-induced virtues as “responsiveness to consumer preferences, competitive equilibration of supply and demand, and so on.” Plus smaller government means lower taxes -- which, in turn, reinforces smaller government.
It all sounds so perfect. And no one can say it has not been attempted. “States that attempted to unleash the magic of competition ended up costing consumers $292 billion in higher electricity prices between 2000 and 2007,” note Brown and Jacobs, including “$48 billion more than consumers paid in states that maintained traditional rate regulation from May 2006 to May 2007.” Bush-era initiatives for “managed competition” in education and health care had the perverse effect of increasing federal power over local school systems while adding “burgeoning regulatory clarifications and correctives as far as the eye can see” to Medicare.
Deregulation of the airlines reduced the price, and increased the convenience, of travel -- at least for a while. But now overbooking of planes, constant rescheduling, and the congestion of routes during peak hours point to the limits of competition.
Meanwhile, pro-market rhetoric never reduces the appetite for pork. “The growth of government is not mainly the work of profligate ‘tax and spend’ Democrats,” the authors point out. “Solidly among the spenders and promoters of government activism were the antistatists who controlled Washington in the early twenty-first century and, indeed, dominated policy debates and held the levers of power in Congress and the White House for three decades.”
The issue here is not philosophical inconsistency. The problem, as Brown and Jacobs understand it, is built into the tendency to frame the relationship between state and market forces as “either/or” instead of “both/and.” They trace a recurrent cycle in public policy over recent decades in which reforms are enacted to increase the role for markets and decrease government regulation. Then follow unintended consequences (higher prices, threats to public safety, breakdown of institutions) -- leading to calls for renewed regulation by state agencies.
But the public sector often proves overextended and underfunded. “All too often government disappoints expectations,” write Brown and Jacobs, “which fuels the rhetorical attacks of the state bashers and deepens the democratic disconnect.”
It leads to a situation the authors call “management by objection” in which “headlines scream, heads roll, band-aids adhere, and the cycle resumes....” Public policy consists of damage control. And that is always too little, too late. Thus it is that “the era’s reigning non sequitur -- if government is so bad, markets must be better -- begins to look axiomatic.”
“Pragmatic” appears to be the authors’ favorite word, with “realist” being a close second. “When politics is premised on a principled denial of the obvious,” they write, “government grows without vision, purpose, or a due concern for its capacities to serve the public.” The result is inefficiency and incoherence -- at best.
The panacea of deregulation leaves “political leaders and civil servants in obscure agencies scrambling to forestall market failures, repair the breakdown of services that the public expects, and respond to the complaints of concerned constituents,” according to Brown and Jacobs -- who presumably wrote this well before things started getting really bad. “Institutional realism should be introduced earlier and more prominently in discussions of policy reform.”
Well, okay -- that all seems fair enough, given the spirit of managerial centrism that pervades this book. But just where is the “institutional realism” supposed to come from?
The authors note that “the development of specialized, well-trained managers and officials equipped with thoughtfully articulated operating procedures and advanced information technologies” has lagged. Meanwhile, when things go wrong, “the public wants government to respond fast and well and is outraged if it dithers.” It does not sound like a promising alignment of circumstances.
The Private Abuse of the Public Interest closes with an expression of hope that recent events may “clear a new space for pragmatism in public policy.” So they might. But things will probably get worse before they get better.
For 26 years I have been an economist researching the pension system. Along the way I formulated some policy implications from the research -- which is the matter-of-fact job of a career academic in modern departments of economics. I have studied pensions my entire adult life. My 1984 dissertation from the University of California at Berkeley was very uncool -- stagflation was the hot area. My work’s subtitle was, “Towards a National Retirement Income Security Policy.”
Fast-forward two-plus decades: on October 7, 2008, I testified in Congress about what should be done about the nation’s eroding retirement income programs. I was invited after an op-ed of mine on the subject ran in The New York Times on September 27, 2008.
My testimony dealt mostly with 401(k) plans, but also with other defined-contribution plans like those offered by TIAA-CREF. Because of the market collapse, some of those plans had lost between 20 and 50 percent of their value, and people’s retirement dreams were evaporating in the worst labor market in 20 years. My book -- When I’m Sixty-Four: The Plot against Pensions and the Plan to Save Them (Princeton University Press, 2008) -- had just been published and I was in Congress telling legislators what should be done to bring stability to our nation’s troubled retirement system. A policy economist’s dream. No? Yes -- but only if the academic understands the new forces of gravity caused by the blogosphere, coupled with the power of a fierce presidential election, the anxiety generated by a crashing economy, and the ordinary force of the lobbying efforts of a well-established industry sector -- in this case the 401(k) industry.
I learned something that you don’t learn as a professor. If you are going to question a long-standing profitable tax break for a powerful industry, get an eBay account and bid on a “thick skin.” In the weeks following my testimony I received more than 20 e-mails a day (most I answered) and about that many Google Alerts. Sen. John McCain alluded to my testimony at campaign rallies in the last days of his failing bid for the White House. I was interviewed by many legitimate reporters. A fraction of the e-mails I received were respectful -- all were filled with fear. The least offensive of the non-respectful e-mails were similar to the one I reprint below, but none were as funny. Most were obscene and threatening.
From: ADAM H---------
To: Teresa Ghilarducci
Sent: Nov 7, 2008 4:47 PM
Subject: 401K policy
Dear Ms. Ghilardt:
I think your Socialist ideas regarding 401K plans are absolute trash.
Get a f****** (asterisks added) new hairstylist.
I wrote back:
Wait. I spend a lot of money on my hair. Maybe too much? What's the matter with my hair? At least my plan is better. I want people to have access to a safe place to save their retirement money. My plan calls for 401(k)s to exist alongside a government program that lets people save in a system similar to what members of Congress and other federal employees have. You’ll get better and safer returns than most people get with their 401(k)s.
What's wrong with that? The hair is a separate issue.
Three weeks after my testimony, and a week before the election, I got my first clue about where the buzz was being created about my plan. I went to a well-organized, exciting conference on Life-Cycle Saving at Boston University. Zvi Bodie, the nation’s leading finance economist, gathered industry leaders and academics to discuss issues in America’s retirement income security system. I sat at the table of gracious, well-dressed, and extremely knowledgeable financial industry executives, people I have grown comfortable with during my stints as a pension trustee. They stunned me by asking if my ears were burning, because I was much discussed at the previous week’s conference on 401(k) plans. What happened was that Rush Limbaugh had given a garbled version of my testimony on his Web site (complete with my photograph). In my book (and in Congress) I proposed the government set up a new plan, which would supplement Social Security, an additional place Americans could save for their retirement.
Only 50 percent of workers have pensions at work. This rate of coverage has been stagnant since the 1970s. Many people don’t know how hard it is to save. In order for an average earner to supplement Social Security benefits at the most basic level, she would have to save 5 percent out of every paycheck for 40 years. Because it is hard, I proposed the government give a tax credit of $600 (indexed for inflation) for everyone towards his or her contribution. 401(k) plans would still exist. But the truth didn’t stand a chance in the hyper-desperate time around the presidential election. The ire of the industry came when I proposed to pay for the tax credits by scaling back dramatically the tax deduction for 401(k) plans, deductions that were expanded greatly under the Bush administration. Without the tax deduction the 401(k) industry knew it would have to lower fees and provide a better, safer, product, and that is when it started a full-court media blitz against me.
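The arithmetic behind that claim can be sketched in a few lines. The salary and return figures below are my own illustrative assumptions, not numbers from the article: they simply show what four decades of steady 5-percent saving compounds to.

```python
# Hypothetical sketch (the $40,000 salary and 3 percent real return are
# illustrative assumptions, not figures from the article): what saving
# 5 percent of every paycheck for 40 years accumulates to.

def accumulated_savings(salary, save_rate, years, real_return):
    """Future value of a constant annual contribution, compounded annually."""
    contribution = salary * save_rate
    total = 0.0
    for _ in range(years):
        total = total * (1 + real_return) + contribution
    return total

nest_egg = accumulated_savings(40_000, 0.05, 40, 0.03)
print(f"Nest egg after 40 years: ${nest_egg:,.0f}")
```

Under these assumptions the saver ends up with roughly $150,000 in today's dollars -- a modest supplement to Social Security, which is the point: even sustained discipline over a full working life produces only a basic cushion.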
I made several mistakes. I had adopted the habit of a teacher who answers any request for knowledge. The University of Notre Dame -- where I taught for 25 years -- encouraged us to respond to all media inquiries. Also, over the summer, I eagerly accepted all requests to be interviewed about my new book. In mid-October, I agreed to two live radio interviews (and didn’t check out their Fox affiliations or the style of the show). In one interview the host thanked me for my time and, after I hung up, told his national audience he hoped I would stop ruining his country. Another host asked me if I wanted to change the tax deduction into a tax credit (a tax deduction means that the higher your tax bracket, the larger your tax subsidy: a person in the 39 percent bracket gets 39 cents from the government for every dollar saved in a 401(k), while a person in the 15 percent bracket gets 15 cents). I said, ironically, I wanted to spread the wealth. Radio hosts don’t do well with irony -- and so my play on the debate over what Barack Obama had told Joe the Plumber was largely missed by the public. Then I got a moniker from a blogger on “Capital Commerce,” which prompted my worried 71-year-old mother to call from California. Blogger James Pethokoukis identified me as "401(k) Foe Teresa Ghilarducci, the Most Dangerous Woman in America." (Yes, I am having a bit of fun with the "most dangerous" tag.)
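The deduction-versus-credit arithmetic described in that parenthetical is easy to make concrete. The contribution amount below is an example of mine, not a figure from the article:

```python
# Illustrative sketch of the deduction-versus-credit arithmetic described
# above; the $1,000 contribution is an example, not a figure from the article.

def deduction_subsidy(contribution, bracket_percent):
    """A deduction's subsidy scales with the saver's marginal tax bracket."""
    return contribution * bracket_percent / 100

def flat_credit_subsidy(credit=600):
    """A flat credit (here, the proposed $600) is the same for every saver."""
    return credit

# For a $1,000 contribution:
print(deduction_subsidy(1000, 39))  # high-bracket saver: $390 subsidy
print(deduction_subsidy(1000, 15))  # low-bracket saver: $150 subsidy
print(flat_credit_subsidy())        # every saver: the same $600
```

This is why the deduction is regressive -- the subsidy grows with the saver's bracket -- while a flat credit delivers the same dollar benefit regardless of income.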
But the legitimate press got the story right and called the discussion of what I proposed an “Urban Myth.” Here is MSNBC’s John W. Schoen reporting on Nov. 7, 2008:
"Hearing on 401(k) plan grows to urban legend (MSM preparing people for government seizure of 401K!) There is no proposal in Congress to take away your 401(k) savings account. In any case, the stock market has already done a pretty good job of wiping out several trillion dollars worth of 401(k) savings without any help from Congress. At that hearing, one of the witnesses, Teresa Ghilarducci, an economics professor at The New School for Social Research, made an interesting observation…. The government spends as much as $80 billion a year in tax breaks to subsidize 401(k) savings plans. In her opinion, that money could be better spent offering a tax credit for a revised retirement plan that would guarantee a minimum income stream to people who saved for retirement So maybe it's not such a bad idea to start listening to new ideas."
Aren’t new ideas what academics are supposed to come up with?
Even after the election I am still material for radio entertainer Rush Limbaugh, who among many roles often poses as a policy wonk.
RUSH: Do you know what's going to happen to you? We don't know what's going to happen, but do you know what the Democrat plan for your 401(k) is?
CALLER: I believe it has something to do with circling the bowl.
RUSH: (laughing) Circling the bowl. You mean like flush it?
RUSH: But seriously, how much do you know about it? You've just heard about it, or you want me to repeat what you know to other people?
CALLER: Please repeat, because like I said, since I knew I wasn't going to vote for Obama, I said, "I don't need to worry about it."
RUSH: Let me give it to you very briefly. So far, this is not Obama yet, but this goes straight to my point about all of the idiots on our side. So one of the big incentives for having a 401(k) came under assault. Then that same committee two weeks later brought in an economist from the New School in New York called Teresa Ghilarducci. I'm having trouble with her name, and not on purpose but her idea is even worse, Darcy. She wants to basically eliminate the 401(k)....
But, if the right wing pays enough attention, the mainstream media will begin to correct some of the blogosphere’s exaggerations.
I am grateful that veteran MarketWatch reporter Robert Powell wrote that Barack Obama must fix the nation's retirement system:
“It's starting to look like a train wreck of immense proportions. The government will be spending billions of dollars in the decades to come bailing out average Americans who don't have enough set aside to pay for basic living expenses if something isn't done now. What is that something? Ghilarducci suggests combining the best features of a 401(k) plan with the best features of traditional pension plans to create what she calls a guaranteed retirement account (GRA), a type of cash-balance pension plan (that many employers now offer) or sovereign wealth fund.”
Fortunately for me, the president of my university, Bob Kerrey, is a public figure and a former policy maker himself, so he is well used to this sort of treatment and is more than supportive. So what will I do the next time I am called to testify? My footnotes of supporting studies will move to the text -- otherwise my views look isolated and can be picked off like a young zebra separated from the pack. I will talk to Fox only on my terms and I will be fully prepared for an attack when I venture into the blazing sun of the blogosphere and C-SPAN. I will continue to publish peer-reviewed research, to be sure; but, next time, before I head out for a wild ride from refereed journals to Rush Limbaugh I’ll have an arsenal of spurs and switches.
Teresa Ghilarducci holds the Bernard L. and Irene Schwartz Chair of Economic Policy Analysis at the New School for Social Research.
The deepening economic crisis has triggered a new wave of budget cuts and hiring freezes at America’s universities. Retrenchment is today’s watchword. For scholars in the humanities, arts and social sciences, the economic downturn will only exacerbate existing funding shortages. Even in more prosperous times, funding for such research has been scaled back and scholars besieged by questions concerning the relevance of their enterprise, whether measured by social impact, economic value or other sometimes misapplied benchmarks of utility.
Public funding gravitates towards scientific and medical research, with its more readily appreciated and easily discerned social benefits. In Britain, the fiscal plight of the arts and humanities is so dire that the Institute of Ideas recently sponsored a debate at King’s College London that directly addressed the question, “Do the arts have to re-brand themselves as useful to justify public money?”
In addition to decrying the rising tide of philistinism, some scholars might also be tempted to agree with Stanley Fish, who infamously asserted that humanities “cannot be justified except in relation to the pleasure they give to those who enjoy them.” Fish rejected the notion that the humanities can be validated by some standard external to them. He dismissed as wrong-headed “measures like increased economic productivity, or the fashioning of an informed citizenry, or the sharpening of moral perception, or the lessening of prejudice and discrimination.”
There is little doubt that the value of the humanities and social sciences far outstrips any simple measurement. As universities and national funding bodies face painful financial decisions and are forced to prioritize the allocation of scarce resources, however, scholars must guard against such complacency. Instead, I argue, scholars in the social sciences, arts, and humanities should consider seriously how the often underestimated value of their teaching and research could be further justified to the wider public through substantive contributions to today’s most pressing policy questions.
This present moment is a propitious one for reconsidering the function of academic scholarship in public life. The election of a new president brings with it an unprecedented opportunity for scholars in the humanities and social sciences. The meltdown of the financial markets has focused public attention on additional challenges of massive proportions, including the fading of American primacy and the swift rise of a polycentric world.
Confronting the palpable prospect of American decline will demand contributions from all sectors of society, including the universities, the nation’s greatest untapped resource. According to the Times Higher Education Supplement’s recently released rankings, the U.S. boasts 13 of the world’s top 20 universities, and 36 U.S. institutions figure in the global top 100. How can scholars in the arts, humanities and social sciences make a difference at this crucial historical juncture? How can they demonstrate the public benefits of their specialist research and accumulated learning?
A report published by the British Academy in September contains some valuable guidance. It argues that the collaboration between government and university researchers in the social sciences and humanities must be bolstered. The report, “Punching Our Weight: the Humanities and Social Sciences in Public Policy Making” emphasizes how expanded contact between government and humanities and social science researchers could improve the effectiveness of public programs. It recommends “incentivizing high quality public policy engagement.” It suggests that universities and public funding bodies should “encourage, assess and reward” scholars who interact with government. The British Academy study further hints that university promotion criteria, funding priorities, and even research agendas should be driven, at least in part, by the major challenges facing government.
The British Academy report acknowledges that “there is a risk that pressure to develop simplistic measures will eventually lead to harmful distortions in the quality of research,” but contends that the potential benefits outweigh the risks.
The report mentions several specific areas where researchers in the social sciences and humanities can improve policy design, implementation, and assessment. These include the social and economic challenges posed by globalization; innovative comprehensive measurements of human well-being; understanding and predicting human behavior; overcoming barriers to cross-cultural communication; and historical perspectives on contemporary policy problems.
The British Academy report offers insights that the U.S. government and American scholars could appropriate. It is not farfetched to imagine government-university collaboration on a wide range of crucial issues, including public transport infrastructure, early childhood education, green design, civil war mediation, food security, ethnic strife, poverty alleviation, city planning, and immigration reform. A broader national conversation to address the underlying causes of the present crisis is sorely needed. By putting their well-honed powers of perception and analysis in the public interest, scholars can demonstrate that learning and research deserve the public funding and esteem which have been waning in recent decades.
The active collaboration of scholars with government will be anathema to those who conceive of the university as a bulwark against the ever encroaching, nefarious influence of the state. The call for expanded university-government collaboration may provoke distasteful memories of the enlistment of academe in the service of the Cold War and the Vietnam War, a relationship which produced unedifying intellectual output and dreadfully compromised scholarship.
To some degree, then, skepticism toward the sort of government-university collaboration advocated here is fully warranted by the specter of the past. Moreover, the few recent efforts by the federal government to engage with researchers in the social sciences and humanities have not exactly inspired confidence.
The Pentagon’s newly launched Minerva Initiative, to say nothing of the Army’s much-criticized Human Terrain System, has generated a storm of controversy, mainly from those researchers who fear that scholarship will be placed in the service of war and counter-insurgency in Iraq and Afghanistan and produce ideologically distorted scholarship.
Certainly, the Minerva Initiative’s areas of funded research -- “Chinese military and technology studies, Iraqi and Terrorist perspective projects, religious and ideological studies," according to its Web site -- raise red flags for many university-based researchers. Yet I would argue that frustration with the Bush administration and its policies must not preclude a dispassionate analysis of the Minerva Initiative and block recognition of its enormous potential for fostering and deepening links between university research and public policy communities. The baby should not be thrown out with the bathwater. The Minerva Initiative, in a much-reformed form, represents a model upon which future university-government interaction might be built.
Cooperation between scholars in the social sciences and humanities and all of the government’s departments should be enhanced by expanding the channels of communication among them. The challenge is to establish a framework for engagement that poses a reduced threat to research ethics, eliminates selection bias in the applicant pool for funding, and maintains high scholarly standards. Were these barriers to effective collaboration overcome, it would be exhilarating to contemplate the proliferation of a series of “Minerva Initiatives” in various departments of the executive branch. Wouldn’t government policies and services -- in areas as different as environmental degradation, foreign aid effectiveness, health care delivery, math and science achievement in secondary schools, and drug policy -- improve dramatically were they able to harness the sharpest minds and cutting-edge research that America’s universities have to offer?
What concrete forms could such university-government collaboration take? There are several immediate steps that could be taken. First, it is important to build on existing robust linkages. The State Department and DoD already have policy planning teams that engage with scholars and academic scholarship. Expanding the budgets as well as scope of these offices could produce immediate benefits.
Second, the departments of the executive branch of the federal government, especially Health and Human Services, Education, Interior, Homeland Security, and Labor, should devise ways of harnessing academic research on the Minerva Initiative model. There must be a clear assessment of where research can lead to the production of more effective policies. Special care must be taken to ensure that the scholarly standards are not adversely compromised.
Third, universities, especially public universities, should incentivize academic engagement with pressing federal initiatives. It is reasonable to envision promotion criteria modified to reward such interaction, whether it takes the form of placements in federal agencies or the production of policy relevant, though still rigorous, scholarship. Fourth, university presidents of all institutions need to renew the perennial debate concerning the purpose of higher education in American public life. Curricula and institutional missions may need to align more closely with national priorities than they do today.
The public’s commitment to scholarship, with its robust tradition of analysis and investigation, must extend well beyond the short-term needs of the economy or exigencies imposed by military entanglements. Academic research and teaching in the humanities, arts and social sciences plays a crucial role in sustaining a culture of open, informed debate that buttresses American democracy. The many-stranded national crisis, however, offers a golden opportunity for broad, meaningful civic engagement by America’s scholars and university teachers. The public benefits of engaging in the policy-making process are, potentially, vast.
Greater university-government cooperation could reaffirm and make visible the public importance of research in the humanities, arts and social sciences.
Not all academic disciplines lend themselves to such public engagement. It is hard to imagine scholars in comparative literature or art history participating with great frequency in such initiatives.
But for those scholars whose work can shed light on and contribute to the solution of massive public conundrums that the nation faces, the opportunity afforded by the election of a new president should not be squandered. Standing aloof is an unaffordable luxury for universities at the moment. The present conjuncture requires enhanced public engagement; the stakes are too high to stand aside.
Gabriel Paquette is a lecturer in the history department at Harvard University.
On April 15, tens of thousands of people attended “tea parties” to denounce Obama’s economic policies – dressed up, some of the protesters were, like refugees from a disaster at a Colonial theme park. “No taxation without representation!” they demanded, having evidently hibernated through the recent election cycle. The right-wing publicity machine dutifully ground out its message that a mass movement was being born.
Suppose we grant the claim (however generous, however imaginative) that the tea parties drew 250,000 supporters. Compare that with the turnout, not quite three years ago, for the “Day Without an Immigrant” rallies, which involved somewhere between 1 and 1.5 million workers – many of them undocumented, which meant that their decision to attend involved some risk of losing a job or being deported. By contrast, last week’s anti-Obama protest made no real demands on its participants, and came after weeks of free and constant publicity by a major television network. Teabaggery also enjoyed the support of prominent figures in the conservative establishment. Yet with all this backing, the entire nationwide turnout for the tea parties involved fewer people than attended the immigrant rallies in a single large city.
The events of April 15 may not have marked the death agonies of the Republican Party. But they certainly amounted to a case of profound rhetorical failure: a moment when old modes of persuasion lost their power. The claim to speak for the concerns of “ordinary Americans” choked on its own pseudo-populist bile. The tea bags were less memorable than the cracked pots. It was hard to watch the footage without thinking that the next Timothy McVeigh must be a face in the crowd – and wondering if his victims ought to bring a class-action suit against Fox News.
Only just so much of the failure of the teabagging movement can be attributed to its instigators’ unfamiliarity with contemporary slang. A new book from the University of Chicago Press helps to clarify why alarmist denunciations of higher taxation and (shudder!) “redistribution of the wealth” just won’t cut it.
The publication of Class War? What Americans Really Think About Economic Inequality by Benjamin I. Page and Lawrence R. Jacobs could not be better timed. Page is a professor of political science at Northwestern University, while Jacobs directs the Center for the Study of Politics and Governance at the University of Minnesota. The authors conducted a national public-opinion survey during the summer of 2007 -- just before the global economic spasms started -- and they also draw on several decades’ worth of polling data in framing their analysis.
The question mark in the title is no accident. Page and Jacobs are not radicals. They insist that there is no class war in the United States. (This, in spite of quoting Warren Buffett’s remark that there actually is one, and that his class has been winning.) They provide evidence that “even Democrats and lower-income workers harbor rather conservative views about free enterprise, the value of material incentives to motivate work, individual self-reliance, and a generalized suspicion of government waste and unresponsiveness.” Their survey found that 58 percent of Democrats and 62 percent of low-income earners agreed that “large differences in pay are probably necessary to get people to work hard.”
But at the same time, they report a widespread concern that the gap between extremes of wealth and poverty is growing and poses a danger. “Although Americans accept the idea that unequal pay motivates hard work,” they find, “a solid majority (59 percent) disagree with the proposition that large differences in income are ‘necessary for America’s prosperity.’”
Not quite three quarters of those polled agreed that “differences in income in America are too large,” and more than two-thirds reject the idea that “the current distribution of money and wealth is ‘fair.’ ” The proposition that “the money and wealth in this country should be more evenly distributed among a larger percentage of the people” was supported by a large majority of respondents.
While inequality may sound like a Democratic talking point (at least during campaign seasons), the authors note that “solid majorities of Republicans (56 percent) and of high income earners (60 percent) agree that income differences are ‘too large’ in the United States. ... Majorities of Republicans (52 percent) and of the affluent (51 percent) favor more evenly distributing money and wealth.” A footnote indicates that the category of “high income” or “affluent” applied to “the 25.2 percent of our respondents who reported family incomes of $80,000 or more per year.”
While informed sources tell me that sales of small left-wing newspapers are up lately, Page and Jacobs are doubtless correct to describe the default setting of American public opinion as a kind of “conservative egalitarianism.” Citizens “want opportunities for economic success,” they write, “and want individuals to take care of themselves when possible. But they also want genuine opportunity for themselves and others, and a measure of economic security to pursue opportunity and to insure themselves and their neighbors against disasters beyond their control.”
And to make this possible, they are reconciled to taxation. “There is not in fact a groundswell of sentiment for cutting taxes. When asked about tax levels in general, only a small minority favored lowering them; most wanted to keep them about the same. Asked to choose among a range of estate-tax rates on very large ($100 million) estates, only a very small minority of Americans -- just 13 percent of them -- picked a rate of zero. The average American favors an estate-tax range of about 25 percent. ... Most Americans say the government should rely a lot on taxes they see as progressive, like corporate income taxes, rather than on regressive measures like payroll taxes. To our surprise, a majority of Americans even say that our government should ‘redistribute wealth by heavy taxes on the rich,’ a sentiment that has grown markedly over the past seventy years.”
And all of this data was gathered, mind you, well before jobs, housing, and retirement savings began to vaporize.
Nothing in Class War? quite answers the question of what political consequences logically follow from the polling data. Perhaps none do, in particular. What people want (or say that they want) is notoriously distinct from what they will actually bestir themselves to do. But it’s worth noting that Page and Jacobs found broad support for increasing the pay of low-income jobs, and drastically reducing the income of those who earn a lot.
“Sales clerks and factory workers should earn $5,000 more a year (about 23 percent more), according to the median responses of those we interviewed,” they write. At the same time, people “want to cut the income of corporate titans by more than half -- from the perceived $500,000 to a desired $200,000. Imagine the reaction of ordinary working Americans if they learned that the CEOs of major national corporations actually pulled in $14 million a year.” Yes, imagine. Then something other than tea might start brewing.
Adolescent exposure to Ayn Rand’s work tends either to convert you to her philosophy of Objectivism or to inoculate you against it. The intensity and depth of the conversion experience vary from person to person. Not everyone can handle the rigors of a totalist system requiring adherents to accept not just laissez faire economics (that’s the easy part) but the full Randian synthesis of ethics, aesthetics, epistemology, and history. There is also a kind of Objectivist psychotherapy, serving to cure altruism and related failings of character.
And so you may approach, without ever quite hoping to achieve, the state of perfect selfishness embodied in John Galt, the mysterious hero of Atlas Shrugged. Once upon this path, you will understand why the seemingly mild-mannered Immanuel Kant was, in fact, an incredibly sinister figure, which spares you the trouble (and it really is trouble) of reading him.
The full course of Randian thought-reform is itself quite demanding, however. Most conversions to Rand’s worldview prove halfhearted. Many are called, but few are Galtian. The world, or at least the United States, is full of people who remember the novels fondly, and vote Republican, while otherwise falling short of the glory. Rand would have scorned them. She was good at scorn, and hardcore Objectivists get a lot of practice at it as well.
But her fans -- as distinct from her followers, sometimes called Randroids, though never by each other -- form the real constituency for the "Atlas Shrugged" movie now in theaters. It is only the first of two or three parts. Whether the project will be finished appears to be a matter of debate among the moviemakers themselves. Clearly, though, it's going over well with its intended market, to judge by the Twitter chatter hailing it as one of the great films of all time. And when I saw it in New York this weekend, the audience clapped at the end, as the credits began to roll.
By that point, my capacity for disbelief had been tested quite enough for one evening; the applause seemed one challenge to it too many. The problem with this incarnation of Atlas Shrugged is not ideology but competence. The film looks cheap. Its cinematography is at roughly the level of a TV show from the 1980s. Rand’s plot is almost operatic in its indifference to plausibility, but none of the cast is up to the challenge. (Even with the lead characters, Hank Rearden and Dagny Taggart, some part of the actors' brains seemed busy checking their iPhones, perhaps to see if that dinner-theater gig came through.)
The film, or rather this installment of it, culminates in the triumphant run of the John Galt Railroad through Colorado, traveling at hundreds of miles per hour over rails fabricated from the surprisingly controversial Rearden Metal. The State Science Institute has issued dire warnings about Rearden Metal. The entire country stops whatever it is doing just to watch this event on television. Pundits on several continents write editorials denouncing the folly of such boldness. The stakes are enormous, for the mighty train is a symbol of the indomitable individual against collectivist tyranny. Either that or the tracks made of Rearden Metal are. Possibly it's both. Anyway, the climax, when it comes, possesses all the grandeur of an Amtrak commercial.
Any audience willing to pay $13 to watch "Atlas Shrugged" at the late screening on a Saturday night will be self-selecting for Randian enthusiasm, of course. People weren’t clapping for the movie, as such. They were applauding Rand’s weltanschauung. She was a genius, which more than makes up for the talent deficit of everyone else involved in the film. My objections are just the gripes of a Marxist who wants his money back.
But in truth, Rand and her work intrigue me. The initial exposure did not yield conversion, by any means, but the inoculation was imperfect. Something about her is fascinating. She is one of the great pulp writers, like Jim Thompson or Richard Shaver. At the same time, her fusion of melodrama and ideology is quite distinctive. I think of Rand (who was an anti-Communist émigré from Russia) as a profoundly Soviet author -- albeit one standing on her head.
In Atlas Shrugged, the greedy proletariat ruthlessly exploits the capitalists. The oppressed capitalists go on strike, then create a utopia under the leadership of John Galt. (In a socialist-realist “production novel” of the 1930s, Galt's analog would be the “positive hero” who grasps the direction of history and provides wise leadership.) The existence of a body of Objectivist scholarship interested me enough to write a long article about Rand for Lingua Franca, some while ago; and I still take an occasional look at the secondary literature on Rand.
But more to the point, "Atlas Shrugged" on screen is disappointing to me because it falls so far short of the movie version of "The Fountainhead" from 1948. Rand had a great deal of say in how that film was made. She did not like the result, but at no point in her life was Rand easy to please. It belongs in the class of films I always watch whenever rerun on television, along with "Psycho," anything with the Marx Brothers, and "Night of the Living Dead." (Make of that list what you will.)
The smoldering glances from Patricia Neal after she sees Gary Cooper and the mighty jackhammer he wields are a lesson in pure cinema:
More talent is concentrated in that clip (including the command of visual metaphor) than can be found in the whole of "Atlas Shrugged."
To put the dud now on screen into perspective, it helps to read Jeff Britting’s paper “Adapting Atlas Shrugged to Film," which appears in Essays on Ayn Rand's 'Atlas Shrugged,' edited by Robert Mayhew and published by Lexington Books in 2009. Britting is the archivist in charge of the author’s papers, held by the Ayn Rand Institute in Irvine, Calif., and he was an associate producer of “Ayn Rand: A Sense of Life,” which received an Oscar nomination for best documentary in 1998.
"Atlas Shrugged" may hold the all-time record for time spent in that realm of Hollywood called “development hell.” The possibility of bringing the novel to screen came up not long after it was published in 1957. Britting draws on “items found among her personal papers, interviews [with] or written statements by Rand, [and] oral histories conducted with people associated with historic efforts to produce a film version of her novel” during Rand’s lifetime. The author was never going to entrust her masterpiece to any other screenwriter, and her papers include several adaptations at various stages of completion. They include a proposed nine-hour TV miniseries and a four-hour theatrical release, in two parts, as well as shorter versions in each medium.
Britting quotes the producer Michael Jaffe, who worked on one effort to put Atlas Shrugged on television, about the standoff between Hollywood and Rand: “The reputation is that her stories are too idea-filled to make into films; if she had stayed out of it and let them just make the movies, take the best of the plot and not be whipsawed by all the philosophy, they’d be great stories. But it was the whipsawing that always killed it…. The people who controlled the rights to her stories would never let you just go out and make the movie.”
But the distinction between story and idea is not valid for an Objectivist. Britting quotes Rand’s definition of plot as “purposeful progression of logically connected events leading to the resolution of a climax.” The actions and choices driving those events reflect the characters’ values; she defines value as “that which one acts to gain and/or keep.” So while it is true that Rand’s characters are prone to giving one another long lectures, her message is embedded in what they do as well as what they say. Her drafts show the author striving to pare down the dialog and remove secondary characters from Atlas Shrugged -- all the while reinforcing its plot as the essential expression of her ideas on screen.
During work on one adaptation, she timed the speech that John Galt delivers to the world by radio at the end of the novel. This, for the true admirer, is one of the greatest pieces of literature and philosophy of all time, and Rand herself would not have disputed the matter. It is a comprehensive statement on the morality of capitalism, the virtue of selfishness, and the absolute evil of interfering with the ordained perfection of the free market. In later nonfiction writings, Rand even took to quoting Galt’s speech as if he were an authority she were citing. (I find this a little creepy.)
The 60 pages or so of Galt's radio broadcast took four hours to read out loud, which would be long for cable access, let alone network TV. But Rand told her producer not to worry: “I will get the speech down to three to seven minutes. I’ll have to do so; no one else is equipped to do that.” Finding the “dramatic equivalent” of parts of Galt’s argument would allow Rand to express her (his?) ideas without taking four hours to do so. This required what she called “dancing back and forth … between abstractions and concretes.”
Screen adaptation, then, is for Rand a late phase of the creative process: a means of preserving the philosophical elements of a plot while responding to a different medium and new circumstances. And with that in mind, the conclusion of Britting’s essay sounds like a criticism of the new film -- except that it was published two years ago.
Anyone adapting Atlas Shrugged today, he writes, “must put down the book and look out at the world, totally on his own -- while taking stock of his own experience -- in order to begin dancing,” as Rand observed, “literally” between the novel’s abstract philosophy and its concretes.
Instead, the makers of the new film have just taken Rand’s story from five decades ago, trimmed it down a bit, and added cell phones and CNN.
Rand set her novel in a vaguely not-too-distant future. But it’s really her dystopian reimagining of the New Deal era. It pictures an America in which the economy is based on industrial production, but menaced by powerful labor unions and legislators eager to regulate businesses. In it, citizens get caught up in feverish debate over the opening of a new railroad, built with an exciting and mysterious new metal.
By 1957, this was already somewhat anachronistic. Today it’s just surreal. Manufacturing’s share of the gross domestic product is half what it was 40 years ago, unions are in trouble, and regulation means that corporations pay a fine and write it off. Exciting technological developments typically do not involve the railroad industry.
It’s hard to imagine how even the most skillful Randian dancer could turn Atlas Shrugged into a 21st-century story. Maybe make Hank Rearden a bioengineer who’s figured out how to integrate people’s genomes with Facebook? Dagny Taggart might run a company that trades risky but extremely profitable financial instruments based on how many people a company puts out of work when it moves from country to country. And it could end with John Galt planting a microchip programmed with his philosophy into everyone’s brains.
Admittedly, this all sounds preposterous, but it couldn’t be worse than the movie now in theaters.