Many a thick academic tome turns out to be a journal article wearing a fat suit. So all due credit to Anna M. Young, whose Prophets, Gurus, and Pundits: Rhetorical Styles and Public Engagement was published by Southern Illinois University Press this year. Her premise is sound; her line of argument looks promising; and she gets right to work without the rigmarole associated with what someone once described as the scholarly, “Yes, I read that one too” tic.
Indeed, several quite good papers could be written exploring the implicit or underdeveloped aspects of her approach to the role and the rhetoric of the public intellectual. Young is an associate professor of communication at Pacific Lutheran University, in Tacoma, Washington. Much of the book is extremely contemporary in emphasis (to a fault, really, just to get my complaint about it out front here). But the issue it explores goes back at least to ancient Rome -- quite a while before C. Wright Mills got around to coining the expression “public intellectual” in 1958, in any case.
The matter in question emerges in Cicero’s dialogue De Oratore, where Young finds discussed a basic problem in public life, then and now. Cicero, or his stand-in character anyway, states that for someone who wants to contribute to the public discussion of important matters, “knowledge of a vast number of things is necessary, without which volubility of words is empty and ridiculous.”
On the other hand -- as Cicero has a different character point out -- mere possession of learning, however deep and wide, is no guarantee of being able to communicate that learning to others. (The point will not be lost on those of you surreptitiously reading this column on your mobile phones at a conference.)
Nobody “can be eloquent on a subject that he does not understand,” says Cicero. Yet even “if he understands a subject ever so well, but is ignorant of how to form and polish his speech, he cannot express himself eloquently about what he does understand.”
And so what is required is the supplementary form of knowledge called rhetoric. The field had its detractors well before Cicero came along. But rhetoric as defined by Aristotle referred not to elegant and flowery bullshit but rather to the art of making cogent and persuasive arguments.
Rhetoric taught how to convey information, ideas, and attitudes by selecting the right words, in the right order, to deliver in a manner appropriate to a particular audience -- thereby convincing it of an argument, generally as a step toward moving it to take a given action or come to a certain judgment or decision. The ancient treatises contain not a little of what would later count as psychology and sociology, and modern rhetorical theory extends its interdisciplinary mandate beyond the study of speech, into all other forms of media. But in its applied form, rhetoric continues to be a skill of skills – the art of using and coordinating a number of registers of communication at the same time: determining the vocabulary, gestures, tone and volume of voice, and so on best-suited to message and audience.
When the expression “public intellectual” was revived by Russell Jacoby in the late 1980s, it served in large part to express unhappiness with the rhetorical obtuseness of academics, particularly in the humanities and social sciences. The frustration was not usually expressed quite that way. It instead took the form of a complaint that intellectuals were selling their birthright as engaged social and cultural critics in exchange for the mess of pottage known as tenure. It left them stuck in niches of hyperspecialized expertise. There they cultivated insular concerns and leaden prose styles, as well as inexplicable delusions of political relevance.
The public intellectual was a negation of all of this. He or she was a free-range generalist who wrote accessibly, and could sometimes be heard on National Public Radio. In select cases the public intellectual was known to Charlie Rose by first name.
I use the past tense here but would prefer to give the term a subscript: The public intellectual model ca. 1990 was understood to operate largely or even entirely outside academe, but that changed over the following decade, as the most prominent examples of the public intellectual tended to be full-time professors, such as Cornel West and Martha Nussbaum, or at least to teach occasionally, like Judge Richard Posner, a senior lecturer in law at the University of Chicago.
And while the category continues to be defined to some degree by contrast with certain tried-and-true caricatures of academic sensibility, the 2014 model of the public intellectual can hardly be said to have resisted the blandishments of academe. The danger of succumbing to the desire for tenure is hardly the issue it once might have seemed.
Professor Young’s guiding insight is that public intellectuals might well reward study through rhetorical analysis -- with particular emphasis on aspects that would tend to be missed otherwise. They come together under the heading “style.” She does not mean the diction and syntax of their sentences, whether written or spoken, but rather style of demeanor, comportment, and personality (or what’s publicly visible of it).
Style in Young’s account includes what might be called discursive tact. Among other things it includes the gift of knowing how and when to stop talking, and of listening to another person’s questions attentively enough to clarify, and even to answer, them. The author also discusses the “physiological style” of various public intellectuals – an unfortunate coinage (my first guess was that it had something to do with metabolism) that refers mostly to how they dress.
A public intellectual, then, has mastered the elements of style that the “traditional intellectual” (meaning, for the most part, the professorial sort) typically does not. The public perceives the academic “to be a failure of rhetorical style in reaching the public. He is dressed inappropriately. She carries herself strangely. He describes ideas in ways we cannot understand. She holds the floor too long and seems to find herself very self-important.” (That last sentence is problematic in that a besetting vice of the self-important is that they do not find themselves self-important; if they did, they’d probably dial it down a bit.)
Now, generations of satirical novels about university life have made clear that the very things Young regards as lapses of style are, in fact, perfectly sensible and effective rhetorical moves on their own terms. (The professor who wears the same argyle sweater year-round has at least persuaded you that he would rather think about the possible influence of the Scottish Enlightenment on The Federalist Papers than about the admittedly large holes in it.)
But she longs for a more inclusive and democratic mode of engagement of scholarship with the public – and of the public with ideas and information it needs. To that end, Young identifies a number of public-intellectual character types that seem to her exemplary and effective. “At different times,” she writes, “and in different cultural milieus, different rhetorical styles emerge as particularly relevant, powerful, and persuasive.” And by Young’s count, six of them prevail in America at present: Prophet, Guru, Sustainer, Pundit, Narrator, and Scientist.
“The Prophet is called by a higher power at a time of crisis to judge sinners in the community and outline a path of redemption. The Guru is the teacher who gains a following of disciples and leads them to enlightenment. The Sustainer innovates products and processes that sustain natural, social, and political environments. The Pundit is a subject expert who discusses the issues of the day in a more superficial way via the mass media. The Narrator weaves experiences with context, creating relationships between event and communities and offering a form of evidence that flies below the radar in order to provide access to information.” Finally, the Scientist “rhetorically constructs his or her project as one that answers questions that have plagued humankind since the beginnings….”
The list is presumably not meant to be exhaustive, but Young finds examples of people working successfully in each mode. Next week we'll take a look at what the schema implies -- and at the grounds for thinking of each style as successful.
A documentary on prison gangs from a few years ago included an interview with a member of the Aryan Brotherhood about his beliefs, though one could easily guess at them at first sight. It is true that the swastika is an ancient symbol, appearing in a number of cultures, having various meanings. As a tattoo, however, it very rarely functions as a good-luck sign or evidence of Buddhist piety. (Well, not for the last 70 years anyway.)
But this Aryan Brotherhood spokesman wanted to make one thing clear: He was not a racist. He didn’t hate anybody! (Nobody who hadn’t earned his hate, anyway.) He simply believed in white separatism as a form of multicultural identity politics. I paraphrase somewhat, but that was the gist of it, and he seemed genuinely aggrieved that anyone could think otherwise. He was, to his own way of thinking, the victim of a hurtful stereotype. People hear “Aryan Brotherhood” and get all hung up on the first word, completely overlooking the “brotherhood” part.
The interviewer did not press the matter, which seemed wise, even with prison guards around. Arguing semantics in such cases accomplishes very little -- and as Stephen Eric Bronner argues in his new book, The Bigot: Why Prejudice Persists (Yale University Press), the bigot is even more driven by self-pity and the need for self-exculpation than by hatred or fear.
“To elude his real condition,” writes Bronner, a professor of political science at Rutgers University, “to put his prejudices beyond criticism and change, is the purpose behind his presentation of self…. But he is always anxious. The bigot has the nagging intuition that he is not making sense, or, at least, that he cannot convince his critics that he is. And this leaves him prone to violence.”
Reminiscent of earlier studies of “the authoritarian personality” or “the true believer,” Bronner combines psychological and social perspectives on the bigot’s predicament: rage and contempt toward the “other” (those of a different ethnicity, religion, sexuality, etc.) are the response of a rigid yet fragile ego to a world characterized not only by frenetic change but by the demands of the “other” for equality. Bronner is the author of a number of other books I've admired, including Of Critical Theory and Its Theorists (originally published in 1994 and reissued by Routledge in 2002) and Blood in the Sand: Imperial Ambitions, Right-Wing Fantasies, and the Erosion of American Democracy (University Press of Kentucky, 2005), so I was glad to be able to pose a few questions to him about his new book by email. A transcript of the exchange follows.
Q: You've taught a course on bigotry for many years, and your book seems to be closely connected -- for example, the list of books and films you recommend in an appendix seems like something developed over many a syllabus. Was it? Is the book itself taken from your lectures?
A: The Bigot was inspired by the interests of my students and my courses on prejudice. Though it isn’t based on the lectures, I tried to organize it in a rigorous way. As Marx once put the matter, the argument rises “from the abstract to the concrete.”
The book starts with a phenomenological depiction of the bigot that highlights his fear of modernity and the rebellion of the Other against the traditional society in which his identity was affirmed and his material privileges were secured. I then discuss the (unconscious) psychological temptations offered by mythological thinking, conspiracy fetishism and fanaticism that secure his prejudices from criticism. Next I investigate the bigot’s presentation of self in everyday life as a true believer, an elitist, and a chauvinist.
All of these social roles fit into my political analysis of the bigot today who (even as a European neo-fascist or a member of the Tea Party) uses the language of liberty to pursue policies that disadvantage the targets of his hatred. The suggested readings in the appendix help frame the new forms of solidarity and resistance that I try to sketch.
Q: On the one hand there are various forms of bigotry, focused on hostility around race, gender, sexuality, religion, etc. But you stress how they tend to overlap and combine. How important a difference is there between "targeted" prejudice and "superbigotry," so to speak?
A: Prejudice comes in what I call “clusters.” The bigot is usually not simply a racist but an anti-Semite and a sexist (unless he is a Jew or a woman) and generally he has much to say about immigrants, gays, and various ethnicities. But each prejudice identifies the Other with fixed and immutable traits.
Myths, stereotypes, and pre-reflective assumptions serve to justify the bigot’s assertions. Gays are sexually rapacious; Latinos are lazy; and women are hysterical – they are just like that and nothing can change them. But the intensity of the bigot’s prejudice can vary – with fanaticism always a real possibility. His fears and hatreds tend to worsen in worsening economic circumstances, his stereotypes can prove contradictory, and his targets are usually chosen depending upon the context.
Simmering anti-immigrant sentiments exploded in the United States after the financial collapse of 2007-8; anti-Semites condemned Jews as both capitalists and revolutionaries, super-intelligent yet culturally inferior, cultish yet cosmopolitan; and now Arabs have supplanted Jews as targets for contemporary neo-fascists in Europe. The point ultimately is that bigotry is about the bigot, not the target of his hatred.
Q: You've written a lot about the Frankfurt School, whose analyses of authoritarianism in Germany and the U.S. have clearly influenced your thinking. You also draw on Jean-Paul Sartre's writings on anti-Semitism and, in his book on Jean Genet, homophobia. Most of that work was published at least 60 years ago. Is there anything significantly different about more recent manifestations of prejudice that earlier approaches didn't address? Or does continuity prevail?
A: Aside from their extraordinary erudition, what I prize in the Frankfurt School and figures like Sartre or Foucault is their intellectual rigor and their unrelenting critical reflexivity. I developed my framework through blending the insights of idealism, existentialism, Marxism, and the Frankfurt School. Other thinkers came into play for me as well. In general, however, I like to think that I too proceeded in relatively rigorous and critical fashion.
In keeping with poststructuralist fashions, and preoccupations with ever more specific understandings of identity, there has been a tendency to highlight what is unique about particular forms of prejudice predicated on race, religion, gender, ethnicity, and the like. The Bigot offers a different approach, but then, most writers are prisoners of their traditions — though, insofar as they maintain their critical intellect, they rattle the cages.
Q: Much of the public understands “bigot” or "racist" mainly as insults, so that the most improbable folks get offended at being so labeled. People hold up pictures of Obama as a witchdoctor with a bone through his nose, yet insist that he's the one who's a racist. Sometimes it's just hypocrisy, pure and simple, but could there be more to it than that? How do you understand all of this?
A: Using the language of liberty to justify policies that disadvantage women, gays, and people of color cynically enables the bigot to fit into a changed cultural and political climate. It is also not merely a matter of the bigot demeaning the target of his prejudice but of presenting himself as the aggrieved party. That purpose is helped by (often unconscious) psychological projection of the bigot’s desires, hatreds, and activities upon the Other.
The persecuted is thereby turned into the oppressor and the oppressor into the persecuted. The bigot’s self-image is mired in such projection. "Birth of a Nation" (1915) -- the classic film directed by D.W. Griffith that celebrates the rise of the KKK -- obsesses over visions of freed black slaves raping white women, even though it was actually white slave owners and their henchmen who were engaged in raping black slave women.
In Europe during the 1920s and 1930s, similarly, anti-Semitic fascists accused Jews of engaging in murder and conspiracy even while their own conspiratorial organizations like the Thule Society in Germany and the Cagoulards in France were, in fact, inciting violence and planning assassinations. Such projection alleviates whatever guilt the bigot might feel and justifies him in performing actions that he merely assumes are being performed by his avowed enemy. Perceiving the threat posed by the Other, and acting accordingly, the bigot thereby becomes the hero of his own drama.
Q: Is there any reason to think prejudice can be "cured" while still at the stage of a delimited and targeted sort of hostility, rather than a full-blown worldview?
A: Fighting the bigot is a labor of Sisyphus. No political or economic reform is secure and no cultural advance is safe from the bigot, who is always fighting on many fronts at once: the anthropological, the psychological, the social, and the political. The bigot appears in one arena only to disappear and then reappear in another.
He remains steadfast in defending the good old days that never were quite so good – especially for the victims of his prejudice. Old wounds continue to fester, old memories continue to haunt the present, and old rumors will be carried into the future. New forms of bigotry will also become evident as new victims currently without a voice make their presence felt.
Prejudice can be tempered (or intensified) through education coupled with policies that further public participation and socioeconomic equality. But it can’t be “cured.” The struggle against bigotry, no less than the struggle for freedom, has no fixed end; it is not identifiable with any institution, movement, or program. Resistance is an ongoing process built upon the guiding vision of a society in which the free development of each is the condition for the free development of all.
Like a t-shirt that used to say something you can’t quite read anymore, a piece of terminology will sometimes grow so faded, or be worn so thin, that retiring it seems long overdue. The threadbare expression “socially constructed” is one of them. It’s amazing the thing hasn’t disintegrated already.
In its prototypical form -- as formulated in the late 1920s, in the aphorism known as the Thomas theorem – the idea was bright and shapely enough: “If men define situations as real, they are real in their consequences.” In a culture that regards the ghosts of dead ancestors as full members of the family, it’s necessary to take appropriate actions not to offend them; they will have a place at the table. Arguments about the socially constructed nature of reality generalize the Thomas theorem: we have access to the world only through the beliefs, concepts, categories, and patterns of behavior established by the society in which we live.
The idea lends itself to caricature, of course, particularly when it comes to discussion of the socially constructed nature of something brute and immune to argumentation like, say, the force of gravity. “Social constructivists think it’s just an idea in your head,” say the wits. “Maybe they should prove it by stepping off a tall building!”
Fortunately the experiment is not often performed. The counterargument from gravity is hardly so airtight as its makers like to think, however. The Thomas theorem holds that imaginary causes can have real effects. But that hardly implies that reality is just a product of the imagination.
And as for gravity -- yes, of course it is “constructed.” The observation that things fall to the ground is several orders of abstraction less than a scientific concept. Newton’s development of the inverse square law of attraction, its confirmation by experiment, and the idea’s diffusion among the non-scientific public – these all involved institutions and processes that are ultimately social in nature.
Isn’t that obvious? So it seems to me. But it also means that everything counts as socially constructed, if seen from a certain angle, which may not count as a contribution to knowledge.
A new book from Temple University Press, Darin Weinberg’s Contemporary Social Constructionism: Key Themes, struggles valiantly to defend the idea from its sillier manifestations and its more inane caricatures. The author is a reader in sociology and fellow at King’s College, University of Cambridge. “While it is certainly true that a handful of the more extravagant and intellectually careless writers associated with constructionism have abandoned the idea of using empirical evidence to resolve debates,” he writes, not naming any names but manifestly glaring at people over in the humanities, “they are a small and shrinking minority.”
Good social constructionist work, he insists, “is best understood as a variety of empirically grounded social scientific research,” which by “turn[ing] from putatively universal standards to the systematic scrutiny of the local standards undergirding specific research agendas” enables the forging of “the tools necessary for discerning and fostering epistemic progress.”
The due epistemic diligence of the social scientists renders them utterly distinct from the postmodernists and deconstructionists, who, by Weinberg's reckoning, have done great damage to social constructionism’s credit rating. “While they may encourage more historically and politically sensitive intuitions regarding the production of literature,” he allows, “they are considerably less helpful when it comes to designing, implementing, and debating the merits of empirically grounded social scientific research projects.”
And that is being nice about it. A few pages later, Weinberg pronounces anathema upon the non-social scientific social-constructionists. They are “at best pseudo-empirical and, at worst, overtly opposed to the notion that empirical evidence might be used to improve our understanding of the world or resolve disputes about worldly events.”
Such hearty enthusiasm for throwing his humanistic colleagues under the bus is difficult to gainsay, even when one doubts that a theoretical approach to art or literature also needs to be “helpful when it comes to designing, implementing, and debating the merits of empirically grounded social scientific research projects.” Such criticisms are not meant to be definitive of Weinberg’s project. A sentence like “Derrida sought to use ‘deconstruction’ to demonstrate how specific readings of texts require specific contextualizations of them” is evidence chiefly of the author’s willingness to hazard a guess.
The book’s central concern, rather, is to defend what Weinberg calls “the social constructionist ethos” as the truest and most forthright contemporary manifestation of sociology’s confidence in its own disciplinary status. As such, it stresses “the crucially important emphases” that Weinberg sees as implicit in the concept of the social – emphases “on shared human endeavor, on relation over isolation, on process over stasis, and on collective over individual, as well as the monumental epistemic value of showing just how deeply influenced we are by the various sociohistorical contexts in which we live and are sustained.”
But this positive program is rarely in evidence so much as Weinberg’s effort to close off “the social” as something that must not and cannot be determined by anything outside itself – the biological, psychological, economic, or ecological domains, for example. “The social” becomes a kind of demiurge: constituting the world, then somehow transcending its manifestations.
It left this reader with the sense of witnessing a disciplinary turf war, extended to almost cosmological dimensions. The idea of social construction is a big one, for sure. But even an XXL can only be stretched just so far before it turns baggy and formless -- and stays that way for good.
So you almost have that book contract in your grasp. You’ve had your most trusted colleagues drop a favorable hint about your work in the ear of the acquisitions editor at the best press in your field. You carefully (and, of course, unobtrusively) stalked said editor at the spring meeting of your disciplinary society, and managed to “accidentally” meet at the drinks reception.
You wrote a follow-up e-mail — not too soon, not too late — with a general query describing your idea and how it fits into the broader publication program at Desirable University Press. And when you received back that warm response — O, happy day! — you observed a decent interval before sending off your polished proposal, on which, of course, you’ve been working ceaselessly for the last six months.
And now you’re refreshing your inbox every five minutes or so, waiting for that hoped-for green light.
Did you ever think — after all your work — that what you were producing was a luxury?
Probably not. All you really want is for the best publisher, whatever that means to you, to publish it; and for your ideas to receive notice in the reviews that matter in your field. Well, you’d probably like your promotion and tenure committee to be impressed, too. Royalties would be nice, but more than anything, you want impact.
Yet maybe you think it should be a luxury, after all the effort and sweat and heartache you’ve invested in it. As far as you’re concerned, it’s pure gold, and should be priced accordingly. You can be sure it will be. According to one book provider for university libraries, the average cover price of an academic book now stands at around $90.00 — a few multiples more than the average price of a book.
It’s not just the price that makes scholarly books a luxury. Think about this line from a recent study of luxury goods: “In luxury, quality is assumed, price does not have to be explained rationally; it is the price of the intangibles (history, legend, prestige of the brand)."
That sounds a lot like the system of scholarly publishing we have come to know and love (and/or loathe). It’s exactly the history, legend, prestige of the brand — the welter of such elements as the name of a given press, the backlist of titles in its catalog, the reputation of the institution with which it is (to a greater or lesser degree) affiliated, the grand old stories we tell about the way a certain editor championed a book against a sea of troubles — that gives the whole enterprise a whiff of mystique and nobility. Scholarly publishing, like any other luxury good, is a reputation-driven business producing goods for a select few at high prices, which in turn transmit a signal about the value of the good — and the prestige of the producer.
But as any social psychologist can tell you, reputations are a bad shortcut to reality. Worse, they can be a fruitful source of bias — filled with meaning we make instead of content we assess.
If you think about it, it’s surprising that scholarly publishing is — and seemingly should be — a business in which brand reputation is not just operative, but essential. Stories abound of promotion and tenure committees advising candidates of the four or five publishers with which a book they present must be placed — at least if they have hopes of further advancement. But of course to say this is to mistake the brand for the content. After all, scholarly merit is supposed to be a function of, well, merit, not mere reputation. Isn’t it? Aren’t we supposed to read the books, and not merely the spine?
• • •
The old chestnut that academic publishing is in a state of crisis may or may not be true; that all depends on your definition of “crisis.” What is certainly true is that the nature of scholarly publishing has changed, in some ways so much that it would scarcely be recognizable to the founding generation of university press directors.
After all, it is only meaningful to distinguish “scholarly publishing” from all other sorts of publishing if it has not just a distinctive content but a distinctive purpose.
The content is indisputably meant to be scholarly work of great merit. Even within a single field disagreements may (and do) arise about exactly what merit is, but no one seriously disputes that the content provided by academic presses is, or ought to be, characterized by a kind of defensible and substantive merit.
That is to say, scholarly publishing — at least in the days American university presses were established — was seen as a way for scholars to communicate their ideas with each other in ways that would not depend, at least not critically, on the market. Exactly because the market would be a poor judge of scholarly merit, producing scholarly work was seen as an extension of institutional mission. Colleges and universities exist not merely to create, but to communicate knowledge; and the social privileges conferred because of that mission (notably, qualification to receive charitable gifts incentivized by the tax code) entail social responsibilities to support both the process and the production of research.
So here’s a thesis. If there truly is a crisis in scholarly publishing, it has arisen from this fundamental first cause: the end of the era in which institutions sponsoring presses saw the publishing of scholarship as something near to the heart of their core mission, and deserving to be supported on those terms. Result: What was never intended to be a system left to the vicissitudes of the market has become exactly that. Scholarly books have become high-priced, prestige-driven luxury goods not by accident, but by forgetfulness.
Symptoms of this shift abound. Presses unable to break even are closed, or severely curtailed, as universities refocus on “strategic priorities." Book prices rise at a rate far higher than inflation in order to cover publishers’ fixed costs as institutional subventions vanish. Authors are chosen not so much on the basis of prize-winning, promising early work but rather because they can command the services of a literary agent.
It doesn’t have to be this way. To solve the crisis we should speak frankly of its causes, and imagine alternatives to received structures. There are three points to keep in view as we invent and test alternatives.
• Open access doesn’t mean poor quality. The push for open access, an idea received with acute suspicion in some quarters, has come about in no small way as a direct consequence of the predictable failure of a market-based system for scholarly publishing to serve its audience.
As a species we are pretty hardwired to associate cost with value — one reason why luxury goods, for which no rational explanation can suffice, yet exist. That is the hardest challenge for open-access advocates (of which I am one) to overcome: how can something free be trusted? But there is no logical connection between the price (as distinguished from the production cost) of a scholarly work and its merit. Yes, assuring quality is a costly business. But there are other ways of paying those costs than depending on purchase-price revenue.
• Communicating ideas is (or should be) critical to the mission of all institutions. The relationship between publishing and the institutional mission needs to be reassessed. Real and lasting change in the broken system of scholarly communication cannot be accomplished by publishers, or libraries, alone. Ultimately it will take a critical mass of institutional leaders able to see how abandoning academic presses to the market was, in effect, abdicating a core scholarly responsibility. I am fortunate to work in an institution led by such people, with the result that the cost of doing the expected work of assuring quality and publishing scholarship will be borne by institutional commitments rather than by consumers.
• Disruptive innovation is messy. Changing the revenue model — shifting the source of the revenue from either end of the value chain (purchases by consumers at one end, or “author fees” at the other) to institutional commitments at the center — is made possible by new technologies for distribution (digital publishing). But it will also mean the emergence of a new set of ideas about the kinds of institutions that do scholarly publishing.
For one thing, there may well be a larger number of publishers producing a smaller number of works on a focused set of topics. Most of the proposed solutions to the “crisis,” both those offered by publishers and those sponsored by foundations, have been essentially focused on preserving the current demographic profile of university presses. It is not self-evident that this is the only solution. Liberal arts colleges (to cite my own example) have a valuable and distinct contribution to make to the identification of what constitutes “scholarship” — but, with a few admirable exceptions, have been frozen out of the conversation by the sheer volume of production required by a market-dependent system. That can now change.
So, too, digital tools make possible not only different ways of producing work, but different ways of organizing the work of publishers. University presses, by and large, are organized as hierarchical firms — and with good reason; such organizations manage market pressures efficiently. But academic publishing could become much more like a commons, adapting to its own purposes Yochai Benkler’s ideas of commons-based peer production in which the uniting thread is a shared passion for the development and distribution of new ideas among colleagues and peers. Said in different terms, what if the future of academic publishing looked less like the Encyclopædia Britannica, and more like Wikipedia?
Good luck on the book contract. When you get it — and, of course, you will — remember why you got into your field in the first place. It probably wasn’t to produce luxuries, but to create ideas and communicate them to your peers — the same reason I wrote this piece. So when you have an idea for your next book, think about working with a publisher who shares those goals.
Mark Edington is director of the Amherst College Press.
At a certain age, you find the slang of the day growing a bit opaque or slippery. Using it becomes a calculated risk. Not that the words or usages are necessarily incomprehensible, though some of them are. (The word “random” now has implications in the American vernacular that I have yet to figure out.) But the unwritten rules of informal correctness are sometimes tricky, and mastering them a challenge.
Simon Blackburn, for many years a professor of philosophy at Oxford, makes use of one recent idiom in his new book Mirror, Mirror: The Uses and Abuses of Self-Love (Princeton University Press) and nearly gets it right. Complaining about “the swarms of ‘selfies’ who infest places of interest, art galleries, concerts, public spaces, and cyberspace,” he elaborates:
“For today’s selfie, the object of each moment is first to record oneself as having been there and second to broadcast the result to as much of the rest of the world as possible. The smartphone is the curse of public space as selfies click away with the lens pointed mainly at themselves and only secondarily at what is around them.”
And so on, in the same vein. (“You selfies get out of my yard!” as it were.) Reaching for the meaning of a recent piece of slang, Blackburn has turned in the right direction but not quite grasped it: “selfie” refers to a photographic self-portrait, not to the person taking the picture. The author is editor of The Oxford Dictionary of Philosophy, so if there is a term for this kind of reversal -- confusing object and subject -- he probably knows it.
But then he may not have fallen behind on the slang at all. “Selfie” really does sound like an apt term for the digital-age clones of Narcissus. Perhaps it will catch on? The phenomenon itself won’t be disappearing any time soon. The recent wildfires in California give us some idea of what an apocalyptic future would look like: people snapping selfies while everything around them is consumed in flames.
The appetite for self-documentation is not just a moral vice or a cultural symptom. The nuisance factor of the selfie is only the most blatant aspect of a tendency that Blackburn identifies as a fundamental problem for both serious thought and everyday life. “With a few exceptions,” he notes, “we can have just about any attitude toward ourselves that we have toward other people, or even to things in the world.”
Ordinary language is full of evidence for this point, since the range of expressions with “self-” as a prefix is incredibly wide and practically uncountable: self-abasement, self-advancement, self-denial, self-respect, self-education, self-inspection, self-consciousness, self-murder…. For now that list should be sufficient, if not self-sufficient. “The exceptions,” Blackburn notes, “only include such trivial things as finding you in my way, which is possible, as opposed to finding myself in my way, which is arguably not, except metaphorically when perhaps it is all too possible.”
So the epistemological and moral problems raised by our relationship with other people or the world (where did they come from? what can we know about them? how should we treat them?) also apply in regard to the self, and are at least as complex, though probably more so. It seems the Oracle of Delphi cunningly hid a vast array of questions in her challenge to Socrates: Know thyself.
After 2,500 years, the mystery has only deepened. Writing in the 18th century, David Hume found the self to be elusive: “For my part, when I enter most intimately into what I call myself, I always stumble on some particular perception or other, of heat or cold, light or shade, love or hatred, pain or pleasure…. When my perceptions are removed for any time, as by sound sleep; so long am I insensible of myself, and may truly be said not to exist.”
Now we map the neural pathways of the brain in fine detail without ever locating the spot where the self picks up its mail. Even so, whole industries exist to provide goods and services to the self -- especially by selling it, if not power, then at least Band-Aids for wounded vanity. Mirror, Mirror quotes the apt definition of arrogance by Kant -- “an unjustified demand that others think little of themselves in comparison with us, a foolishness that acts contrary to its own end” -- that conveys how repulsive and counterproductive it is. And yet costly advertising campaigns are built around models who project, as Blackburn neatly expresses it, “the vacant euphoria” of absolute self-absorption: “even [their] blankness is a kind of disdain -- a refusal of human reciprocity.”
The ads embody the fantasy of an invulnerable self, inspiring awe but immune to envy, available for the price of hair-care products or new clothes. It’s a con, but a successful one, meaning that it leaves the mark poorer but no wiser. (Hence still vulnerable.)
Mirror, Mirror is not primarily a work of social or cultural criticism. At the same time, phenomena such as selfies or billboards filled with pouty, sneering, beautiful people are more than just irritants that provoke Blackburn into writing casual but learned essays that leave you wanting to read (or reread) the philosophical and literary works he draws on. That, too. But the author’s shuttling between current trends and venerable texts seems enlivened by the occasional hint that his interest is, in part, personal.
That could be my imagination. Blackburn never waxes memoiristic; he uses the first person sparingly. Still, the book implies a quest, Socrates-like, for self-knowledge -- by no means to be confused with what Narcissus was after.