Review of 'Philip Sparrow Tells All'

A novelist and English professor named Samuel M. Steward was fired by DePaul University in 1956 for the offense of running a tattoo parlor on Chicago’s Skid Row. He did not have the option (so readily taken for granted these days) of explaining it as full-immersion ethnographic research, nor did the fact that he’d practiced this sideline under a pseudonym, Philip Sparrow, count as mitigation. By then Steward was in his late 40s and had been teaching for well over 20 years, but his academic career was finished.

It was a moment of emergence, however, not of decline. Within a few years, the defrocked professor moved to California. His artistry with the ink gun put Philip Sparrow in demand among the Hell’s Angels, whose standards are rigorous and exacting to a degree academe never quite manages. (Being thrown out of the Angels can include relinquishing one’s tattoos, a process that sometimes involves a blowtorch.)

In the late 1970s, he went back to using his given name and under it published, among other things, a collection of letters from his friends Gertrude Stein and Alice B. Toklas. But use of a pseudonym seems to have permitted the flourishing of aspects of his creative identity that might have gone unexpressed. Besides his renown among tat connoisseurs as Philip Sparrow, he also wrote a considerable amount of pornographic fiction under the name Phil Andros -- which was kind of a meta pun, splitting the Greek components of “philanderer” (a man who has sex with a great many women) and repurposing them for gay use (“lover of men”).

He died in 1993 at the age of 84, leaving behind the papers that allowed Justin Spring to write Secret Historian: The Life and Times of Samuel Steward (Farrar, Straus and Giroux), a National Book Award Finalist for 2010. The big online booksellers show the fiction of Phil Andros to be available and in demand, although nearly everything Steward published under his own name has long since gone out of print. But the University of Chicago Press has now added something to his stature as an author by publishing Philip Sparrow Tells All: Lost Essays by Samuel Steward, Writer, Professor, Tattoo Artist, edited by Jeremy Mulderig -- who, by a nice bit of karma, happens to be an associate professor emeritus of English at DePaul University.

Occasionally a book’s subtitle all but defies the reader not to have a look, and in this case the photos of the author on the front cover alone are pretty striking, given Steward's resemblance to John Waters. The contents of the volume are selected from the column Steward wrote, under the Philip Sparrow byline, for the Illinois Dental Journal between 1944 and 1949.

So the publisher’s description says: I did not make it up, nor could I. While Justin Spring’s biography of Steward from five years ago had been widely and well reviewed, I had not heard about it, and so I suspected, for a moment, that Philip Sparrow Tells All was a prank, either by the University of Chicago Press or on it. The essays of a tattoo artist recovered from 70-year-old issues of the Illinois Dental Journal? Come on.

Exercising due diligence, I learned just enough to confirm that the author actually existed -- then decided to stop reading more about him. First, I wanted to read some of the essays themselves. The world is full of colorful characters who try to write, but eccentricity and adventurousness are not, in practice, qualifications for authorship. (To their credit they sometimes recognize this and offer to tell a writer their stories, in exchange for a share of the advance.) So I skipped the book’s introductory matter and the headnotes the editor had prepared for each piece and went right to Steward’s own prose.

The first selection, his inaugural column, was indeed written with the publication’s audience in mind: “The Victim’s Viewpoint: On Sublimated Sadism; or, the Dentist as Iago.” The tone or manner is approximately that of Robert Benchley:

“We have opened our mouth widely for many of these torturers, from Maine to Montana, and we are ready to swear that on more than one occasion -- as we have been approached, lying there helpless and trembling -- we have seen a diabolic gleam in their eyes as they reached for their tools. There is one certain prober, doubtless invented by Beelzebub, which they use when they begin their preliminary surveying. It is shaped vaguely like a sophisticated corkscrew, and is evidently intended to search out the secret places of one’s heart; we personally have felt it go even lower, and are sure it once left a scar on our right kidney. … but let us draw a curtain over this painful scene; even in thinking of it we have worked ourselves into a cold dribble.”

Something like this essay probably appeared in every college humor magazine in the country at least once per issue for a decade on either side of its January 1944 publication date. It seems well-enough done for what it is; the best that might be said for it is that the author cleared his throat.

An abrupt shift in topic and style comes with the following piece, “On Cryptography,” published that October -- a sprightly introduction to a matter of great wartime interest. The title sounds like an allusion to the essays of Montaigne, and where the Iago reference in his debut seemed arch and contrived, here Steward’s use of classical and contemporary references (e.g., calling Suetonius “the Walter Winchell of ancient Rome”) proves both humorous and apropos. The next month’s column “On Alcoholics Anonymous” -- explaining the principles of an organization just beginning to catch the public’s attention -- comes about as close to memoir as possible without violating the distance implied by the authorial “we.”

It’s a remarkable progression in just three essays, and it doesn’t end there. With the measure of safety provided by a pseudonym -- and also by the less-than-mass circulation of the Illinois Dental Journal -- Steward experimented with the comic, personal and confessional modes of the casual essay in ways that might have been difficult to risk otherwise.

After sampling enough of the book to determine that the columns were of interest in their own right, rather than as a supplement to the author’s biography, I started reading Jeremy Mulderig’s introductory material. It clarifies a great deal, beginning with the essayist’s venue: Steward was attracted to his dentist, who happened to be the magazine’s editor. Its more typical fare was articles with titles like “Important Considerations in Porcelain Veneer Restoration,” but a column written from a nonprofessional vantage point seemed worthwhile, if only for variety. The dentist accepted Steward’s offer to write for the journal, though not, it seems, his other propositions.

After writing several pieces for “The Victim’s Viewpoint” (the column’s title for most of 1944), Steward decided to reboot it as something more wide-ranging. Which explains the nine-month gap between the first and second selections in Philip Sparrow Tells All, and the marked change in the writing itself. Including just one piece from the column’s beta version seems like a wise choice on Mulderig’s part. The wit and whimsy of dentistry as seen from the patient’s-eye view must have been stretched pretty thin after a couple of months.

Many of the columns take on a more humorous quality when you know that the author had a sex life active enough to impress Alfred Kinsey. And no doubt that will be a selling point for the book. But the tension between overt statement and implicit meaning can have effects other than amusement, and in the case of one essay, that tension seems especially powerful.

Published in February 1945, it anticipates the difficulties ahead as American society tries to reabsorb returning servicemen (and vice versa). Here is one passage:

“Only those who have been shot at can love and understand each other. We at home can never comprehend the powerful fraternalism that unites the men who belong, by reason of their experiences, to the ghostly brotherhood of war. When death is behind a bush that trembles, when it explodes in burning phosphorous to kill the friend who was joking a moment before, when it surrounds you with bodies black with flies and bloated by the sun until they at last explode, when your foot slides upon the stinking decayed intestines of a thing that was once a man -- only then, after the bony fingers have inscribed the membership card with your name, and you have looked into the fearful emptiness of the sockets in a fleshless skull, are your dues paid and you yourself a member of the League of War. … They have their own code of morals which we cannot possibly understand, and which will baffle and dismay us utterly. They will be startled and chagrined by what they will consider our indifference, but is really only our own inexperience slowly woven around us in our geographically and emotionally isolated chrysalis.”

Meaningful enough as these lines are on the most manifest level, they take on even more significance in the light of Allan Bérubé’s Coming Out Under Fire: The History of Gay Men and Women in World War Two (Free Press, 1990). Bérubé showed how important the experience of the war was to the formation of a sense of gay identity and community in the United States.

Steward himself was a Naval enlistee but did not see combat. There is an ambivalence, intimacy, pain and sadness to the essay that can be felt by a reader who knows nothing about the author. But it seems clear that the traumatized fighting men he depicts weren’t sociological abstractions but friends and lovers.

It bears reiterating that the name under which he published the essay, Philip Sparrow, was the one he later used as a tattoo artist -- and the one he preferred to go by for some while after being expelled from the groves of academe. It was the identity he assumed at the limits of permissible expression. “Man is least himself when he talks in his own person,” wrote Oscar Wilde. “Give him a mask, and he will tell you the truth.”

Commentary on Heidegger's Black Notebooks

Every few years, somebody notices that Martin Heidegger was a Nazi -- and it all starts up again: the polemics, the professions of shock, the critiques of his philosophy’s insidious role in the humanities. At times the denunciations have a rather generic quality, as if a search-and-replace macro had been used to repurpose a diatribe against John Dewey or Jacques Derrida. Calls for a boycott of Heidegger’s writings are issued by people who cannot name two of them.

The Heidegger bashers tend to be the loudest, but there are counterdemonstrators. Besides the occasional halfhearted search for mitigating circumstances (the Weimar Republic did not make for clear thinking, after all, and the man’s thought was normally pitched at stratospheric levels of abstraction rather than the grubby domain of party politics) there is the sport of itemizing the anti-Heideggerians’ lapses in scholarship. Every line of argument on either side of the dispute was established during the controversy provoked by Victor Farias’s Heidegger and Nazism (1987), yet l’affaire Heidegger has been recycled on at least three or four occasions since then. It’s as if the shock of the scandal was so great that it induced amnesia each time.

The most recent episode (Heidegger Scandal 5.0?) followed the publication in Germany, last year, of the first batch of the philosophical and political musings that Heidegger began recording in notebooks from 1931 onward. An English translation is forthcoming, so count on the outrage to renew in about six months. In the meantime, let me recommend a sharp and succinct overview of the Heidegger matter that may be of interest to anyone who hasn’t caught the earlier reruns. It appeared in the interdisciplinary journal Science & Society under the title “Notes on Philosophy in Nazi Germany.” The author, V. J. McGill, was for many years a professor of philosophy and psychology at Hunter College. “In the midst of the disillusionment and insecurity of postwar Germany and emerging fascism,” he wrote:

“Heidegger saw in care (Sorge) and anxiety (Angst), the basic substance of man. Instead of offering a rational solution of some kind he devoted himself to fine-spun philological distinctions, to an analysis of the pivotal concept of ‘nothing’ and to a subtle exploration of ‘death’ of which he says that we experience it only in the mode of ‘beside’ -- that is, beside death. History, culture, freedom, progress are illusory. He finds our salvation in the recovery of a primordial sense of coexistence with other beings, that is, a bare feeling of togetherness, deprived of all the amenities and hopes which make social life worth while ….

“The hundreds who flocked to Heidegger's very popular lectures in Freiburg learned that anxiety is the final, irremedial substance of man, and left with such esoteric wisdom as their main reward. Heidegger's philosophy was not distasteful to the Nazis, and when he was made rector of the University of Freiburg, he gave an address praising the new life which animated German universities. In recent years a rift has occurred. But philosophers can fall out with the Nazis on other grounds than their ideas and doctrines.”

McGill’s article was published in 1940. Over the intervening three quarters of a century, additional details have emerged, including documentation that Heidegger was not just an ally of the Nazi Party but also a full member from 1933 to 1945. And interest in his work on the part of several generations of philosophers who never showed the slightest bent towards fascism has meant much debate over the validity of reducing Heidegger’s philosophical concepts to their political context. But for all the anger that simmers in McGill’s discussion of Heidegger as an academic lackey of the Third Reich, his account is matter-of-fact and nonsensationalist, and little of the recent commentary can be said to improve upon it.

The Black Notebooks, as Heidegger’s private ruminations are known, sound ghastly on a number of fronts. The volumes published so far cover the years 1931 through 1941. Those covering the rest of the war years are being edited, and Heidegger is reported to have continued keeping the notebooks until his death in 1976. Richard Polt, a professor of philosophy at Xavier University and a translator of Heidegger’s work, identifies 19 passages (out of about 1,200 pages) that attack Jews in terms that might as well have come from an editorial by Joseph Goebbels. After the war Heidegger claimed to have become disillusioned with the Nazis within a couple of years of joining the party -- and the notebooks show this to have been true, strictly speaking. But his objections were to the boorishness and careerism of men who didn’t share his lofty understanding of Hitler’s ideology.

As with the anti-Semitism, this does not come as a revelation, exactly. His reference to “the inner truth and greatness of National Socialism” in a lecture from 1935 remained in the text when he published it in 1953. Beyond defiant, it was a gesture indicating a certain megalomania: Heidegger hadn’t betrayed the Führer’s vision, the Nazis had!

But as David Farrell Krell, a professor emeritus of philosophy at DePaul University, suggests in a recent issue of Research in Phenomenology, the Black Notebooks reveal not just disappointment with the regime (combined with perfect callousness towards its brutality) but levels of rage, bile and despair that keep him from thinking. Heidegger cannot challenge himself, only repeat himself. “From day to day and day after day,” Krell says, Heidegger “entirely forgets that he has written the same things over and over again with the identical amount of dudgeon.”

Heidegger loathed Freud and psychoanalysis, which only makes it tempting to subject him to a little armchair diagnostics. But Krell's point, if I understand him correctly, is that the repetitiveness is more than symptomatic; the Black Notebooks document Heidegger not as a philosopher seduced by totalitarian politics, but as someone who has quit blazing a pathway of thought and instead become trapped in a maze of his own fabrication. Unfortunately, he is not the only one so trapped:

“At least part of the allure of the ongoing Heidegger scandal,” writes Krell in a passage that lights up the sky, “is that it distracts us from our own appalling national stupidities and our galling national avarice -- our own little darkenings, if you will. It is so much easier to fight battles that have already been decided and so lovely to feel oneself securely moored in the harbor of god’s own country. Not the last god but the good old reliable one, who blesses every stupidity and earns interest on every act of avarice. … The irony is that Heidegger’s Notebooks themselves reflect this dire mood. Perhaps by condemning him and them, we hope to gain a bit of space for ourselves, some impossible space for ourselves? That is the shadow these Notebooks cast over those who are so anxious to condemn. And that would be the Notebooks’ most terrible victory: it is not that the last laugh laughs best, for there is no joy and no laughter in them, but that their helpless rage recurs precisely in those who rail against them.”

Remember that next spring, when the controversy starts up once more.

Review of Finn Brunton and Helen Nissenbaum, 'Obfuscation: A User's Guide for Privacy and Protest'

When downloading an app or approving a software update, I now usually hesitate for a moment to consider something the comedian John Oliver said early this summer: a software company could include the entire text of Mein Kampf in the user agreement and people would still click the “agree” button.

“Hesitate” is the wrong word for something that happens in a fraction of a second. It’s not as if I ever scrolled back through to make sure that, say, Microsoft is not declaring that it owns the copyright to everything written in OneNote or Word. The fine print goes on for miles, and anyway, a user agreement is typically an all-or-nothing proposition. Clicking “agree” is less a matter of trust than of resignation.

But then, that’s true about far more of life in the contemporary digital surround than most of us would ever want to consider. Every time you buy something online, place a cell phone call, send or receive a text message or email, or use a search engine (to make the list no longer nor more embarrassing than that), it is with a likelihood, verging on certainty, that the activity has been logged somewhere -- with varying degrees of detail and in ways that might render the information traceable directly back to you. The motives for gathering this data are diverse; so are the companies and agencies making use of it. An online bookseller tracks sales of The Anarchist Cookbook in order to remind customers that they might also want a copy of The Minimanual of the Urban Guerrilla, while the National Security Agency will presumably track the purchase with an eye to making correlations of a different sort.

At some level we all know such things are happening, probably without thinking about it any more often than strictly necessary. Harder to grasp is the sheer quantity and variety of the data we generate throughout the day -- much of it trivial, but providing, in aggregate, an unusually detailed map of what we do, who we know and what’s on our minds. Some sites and applications have “privacy settings,” of course, which affect the totality of the digital environment about as much as a thermostat does the weather.

To be a full-fledged participant in 21st-century society means existing perpetually in a state of information asymmetry, in the sense described by Finn Brunton and Helen Nissenbaum in Obfuscation: A User’s Guide for Privacy and Protest (MIT Press). You don’t have to like it, but you do have to live with it. The authors (who teach media culture and communications at New York University, where they are assistant professor and professor, respectively) use the term “obfuscation” to identify various means of leveling the playing field, but first it’s necessary to get a handle on information asymmetry itself.

For one thing, it is distinct from the economic concept of asymmetrical information. The latter applies to “a situation in which one party in a transaction has more or superior information compared to another.” (So I find it explained on a number of websites ranging from the scholarly to the very sketchy indeed.) The informed party has an advantage, however temporary; the best the uninformed can do is to end up poorer but wiser.

By contrast, what Brunton and Nissenbaum call information asymmetry is something much more entrenched, persistent and particular to life in the era of Big Data. It occurs, they explain, “when data about us are collected in circumstances we may not understand, for purposes we may not understand, and are used in ways we may not understand.” It has an economic aspect, but the implications of information asymmetry are much broader.

“Our data will be shared, bought, sold, analyzed and applied, all of which will have consequences for our lives,” the authors write. “Will you get a loan, or an apartment, for which you applied? How much of an insurance risk or a credit risk are you? What guides the advertising you receive? How do so many companies and services know that you’re pregnant, or struggling with an addiction, or planning to change jobs? Why do different cohorts, different populations and different neighborhoods receive different allocations of resources? Are you going to be, as the sinister phrase of our current moment of data-driven antiterrorism has it, ‘on a list’?”

Furthermore (and here Brunton and Nissenbaum’s calm, sober manner can just barely keep things from looking like one of Philip K. Dick’s dystopian novels), we have no way to anticipate the possible future uses of the galaxies of personal data accumulating by the terabyte per millisecond. The recent series Mr. Robot imagined a hacker revolution in which all the information related to personal debt was encrypted so thoroughly that no creditor would ever have access to it again. Short of that happening, obfuscation may be the most practical response to an asymmetry that’s only bound to deepen with time.

A more appealing word for it will probably catch on at some point, but for now “obfuscation” names a range of techniques and principles created to make personal data harder to collect, less revealing and more difficult to analyze. The crudest forms involve deception -- providing false information when signing up with a social media site, for example. A more involved and prank-like approach would be to generate a flood of “personal” information, some of it true and some of it expressing one’s sense of humor, as with the guy who loaded up his Facebook profile with so many jobs, marriages, relocations, interests and so on that the interest-targeting algorithms must have had nervous breakdowns.

There are programs that will click through on every advertisement that appears as you browse a site (without, of course, bothering you with the details) or enter search engine terms on topics that you have no interest in, thereby clouding your real searches in a fog of irrelevancies.
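
The authors describe such tools in prose rather than code, but the mechanism is simple enough to sketch. What follows is purely a hypothetical illustration -- the topic list, the search endpoint and the timing are all invented for the example, and no actual tool’s design is being reproduced:

```python
# Hypothetical sketch of search-query obfuscation: scatter decoy
# queries among the real ones so that any log of "your" searches
# fills up with plausible noise. Endpoint and topics are invented.
import random
import time
import urllib.parse
import urllib.request

DECOY_TOPICS = [
    "weather in oslo",
    "used lawnmower parts",
    "history of the zipper",
    "banana bread recipe",
    "seventeenth-century dutch painting",
]

def send_decoy_query(topic: str) -> None:
    """Issue a throwaway search; the response is discarded."""
    url = "https://search.example.com/?q=" + urllib.parse.quote(topic)
    try:
        urllib.request.urlopen(url, timeout=5)
    except OSError:
        pass  # a failed decoy costs nothing

def run_obfuscator(queries_per_hour: float = 12.0) -> None:
    """Send decoys at randomized intervals so they form no obvious pattern."""
    mean_gap = 3600.0 / queries_per_hour
    while True:
        send_decoy_query(random.choice(DECOY_TOPICS))
        time.sleep(random.expovariate(1.0 / mean_gap))
```

Note that nothing is hidden here: the real queries still go out. The design choice is to abandon secrecy in favor of noise, which is exactly what distinguishes obfuscation from encryption.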

The cumulative effect would be to pollute the data enough to make tracking and scrutiny more difficult, if not impossible. Obfuscation raises a host of ethical and political issues (in fact the authors devote most of their book to encouraging potential obfuscators to think about them) as well as any number of questions about how effective the strategy might be. We’ll come back to this stimulating and possibly disruptive little volume in weeks to come, since the issues it engages appear in other new and recent titles. In the meantime, here is a link to an earlier column on a book by one of the co-authors that still strikes me as very interesting and, alas, all too pertinent.

Review of Jacques Le Goff, 'Must We Divide History Into Periods?'

George Orwell opened one of his broadcasts on the BBC in the early 1940s by recounting how he’d learned history in his school days. The past, as his teachers depicted it, was “a sort of long scroll with thick black lines ruled across it at intervals,” he said. “Each of these lines marked the end of what was called a ‘period,’ and you were given to understand that what came afterwards was completely different from what had gone before.”

The thick black lines were like borders between countries that didn’t know one another’s languages. “For instance,” he explained, “in 1499 you were still in the Middle Ages, with knights in plate armour riding at one another with long lances, and then suddenly the clock struck 1500, and you were in something called the Renaissance, and everyone wore ruffs and doublets and was busy robbing treasure ships on the Spanish Main. There was another very thick black line drawn at the year 1700. After that it was the Eighteenth Century, and people suddenly stopped being Cavaliers and Roundheads and became extraordinarily elegant gentlemen in knee breeches and three-cornered hats … The whole of history was like that in my mind -- a series of completely different periods changing abruptly at the end of a century, or at any rate at some sharply defined date.”

His complaint was that chopping up history and simplistically labeling the pieces was a terrible way to teach the subject. It is a sentiment one can share now only up to a point. Orwell had been an average student; today it would be the mark of a successful American school district if the average student knew that the Renaissance came after the Middle Ages, much less that it started around 1500. (A thick black line separates his day and ours, drawn at 1950, when television sales started to boom.)

Besides, the disadvantages of possessing a schematic or clichéd notion of history are small by contrast to the pleasure that may come later, from learning that the past was richer (and the borders between periods more porous) than the scroll made it appear.

Must We Divide History Into Periods? asked Jacques Le Goff in the title of his last book, published in France shortly before his death in April 2014 and translated by M. B. DeBevoise for the European Perspectives series from Columbia University Press. A director of studies at L'École des Hautes Études en Sciences Sociales in Paris, Le Goff was a prolific and influential historian with a particular interest in medieval European cities. He belonged to the Annales school of historians, which focused on social, economic and political developments over very long spans of time -- although his work also exhibits a close interest in medieval art, literature and philosophy (where changes were slow by modern standards, but faster than those in, say, agricultural technique).

Le Goff’s final book revisits ideas from his earlier work, but in a manner of relaxed erudition clearly meant to address people whose sense of the past is roughly that of young Orwell. And in fact it is that heavy mark on the scroll at the year 1500 -- the break between the Middle Ages and the Renaissance -- that Le Goff especially wants the student to reconsider. (I say “student” rather than “reader” because time with the book feels like sitting in a lecture hall with a memorably gifted teacher.)

He quotes one recent historian who draws the line a little earlier, with the voyage of Christopher Columbus in 1492: “The Middle Ages ended, the modern age dawned, and the world became suddenly larger.” Le Goff is not interested in the date but in the stark contrast that is always implied. Usually the Middle Ages are figured as “a period of obscurity whose outstanding characteristic was ignorance” -- happily dispelled by a new appreciation for ancient, secular literature and a sudden flourishing of artistic genius.

Calling something “medieval” is never a compliment; the image that comes to mind is probably that of a witch trial. By contrast, “Renaissance” would more typically evoke a page from Leonardo da Vinci’s notebooks. Such invidious comparison is not hard to challenge. Witch trials were rare in the Middle Ages, while the Malleus Maleficarum appeared in “the late fifteenth century,” Le Goff notes, “when the Renaissance was already well underway, according to its advocates.”

Given his expertise on the medieval city -- with its unique institutional innovation, the university -- Le Goff makes quick work of demolishing the notion of the Middle Ages having a perpetually bovine and stagnant cultural life. The status of the artist as someone “inspired by the desire to create something beautiful” who had “devoted his life to doing just this” in pursuit of “something more than a trade, nearer to a destiny,” is recognized by the 13th century. And a passage from John of Salisbury describes the upheaval underway in the 12th century, under the influence of Aristotle:

“Novelty was introduced everywhere, with innovations in grammar, changes in dialectic, rhetoric declared irrelevant and the rules of previous teachers expelled from the very sanctuary of philosophy to make way for the promulgation of new systems …”

I can’t say that the name meant anything to me before now, but the entry on John of Salisbury in the Stanford Encyclopedia of Philosophy makes it sound as if Metalogicon (the work just quoted) was the original higher ed polemic. It was “ostensibly written as a defense of the study of logic, grammar and rhetoric against the charges of the pseudonymous Cornificius and his followers. There was probably not a single person named Cornificius; more likely John was personifying common arguments against the value of a liberal education. The Cornificians believe that training is not relevant because rhetorical ability and intellectual acumen are natural gifts, and the study of language and logic does not help one to understand the world. These people want to seem rather than to be wise. Above all, they want to parlay the appearance of wisdom into lucrative careers. John puts up a vigorous defense of the need for a solid training in the liberal arts in order to actually become wise and succeed in the ‘real world.’”

That's something an Italian humanist might write four hundred years later to champion “the new learning” of the day. And that is no coincidence. Le Goff contends that “a number of renaissances, more or less extensive, more or less triumphant,” took place throughout the medieval era -- an elaboration of the argument by the American historian Charles H. Haskins in The Renaissance of the Twelfth Century (1927), a book that influenced scholars without, as Le Goff notes, having much effect on the larger public. The Renaissance, in short, was a renaissance -- one of many -- and in Le Goff’s judgment “the last subperiod of a long Middle Ages.”

So, no bold, clean stroke of the black Magic Marker; more of a watercolor smear, with more than one color in the mix. Le Goff treats the Middle Ages as having a degree of objective reality, insofar as certain social, economic, religious and political arrangements emerged and developed in Europe over roughly a thousand years.

At the same time, he reminds us that the practice of breaking up history into periods has its own history -- deriving, in its European varieties, from Judeo-Christian ideas, and laden with ideas of decline or regeneration. “Not only is the image of a historical period liable to vary over time,” he writes, “it always represents a judgment of value with regard to sequences of events that are grouped together in one way rather than another.”

I'm not entirely clear how, or if, he reconciled the claim to assess periodization on strictly social-scientific grounds with its status as a concept with roots in the religious imagination, but it's a good book that leaves the reader with interesting questions.

Author and former college president offers advice to parents on the first year of college

How should parents prepare children for college? In a new book, a former college president takes a look at programs and resources at five different institutions to find out what students need and what parents should do during the first year of college.

Authors discuss reasoning behind high levels of Asian American achievement

Authors discuss new book on high levels of Asian-American achievement in education in which they argue that it's not about culture.

Review of Naomi Zack, 'White Privilege and Black Rights: The Injustice of U.S. Police Racial Profiling and Homicide'

You don’t hear much about the United States being a “postracial society” these days, except when someone is dismissing bygone illusions of the late ’00s, or just being sarcastic. With the Obama era beginning to wind down (as of this week, the president has just under 18 months left in office) American life is well into its post-post-racial phase.

Two thoughts: (1) Maybe we should retire the prefix. All it really conveys is that succession does not necessarily mean progress. (2) It is easy to confuse an attitude of cold sobriety about the pace and direction of change with cynicism, but they are different things. For one, cynicism is much easier to come by. (Often it’s just laziness pretending to be sophisticated.) Lucid assessment, on the other hand, is hard work and not for the faint of spirit.

Naomi Zack’s White Privilege and Black Rights: The Injustice of U.S. Police Racial Profiling and Homicide (Rowman & Littlefield) is a case in point. It consists of three essays plus a preface and conclusion. Remarks by the author indicate it was prepared in the final weeks of last year, with the events in Ferguson, Mo., fresh in mind. But don’t let the title or the book’s relative brevity fool you. The author is a professor of philosophy at the University of Oregon -- and when she takes up terms such as “white privilege” or “black rights,” it is to scrutinize the concepts rather than to use them in slogans.

Despite its topicality, Zack’s book is less a commentary on recent events than part of her continuing effort to think, as a philosopher, about questions of race and justice that are long-standing, but also prone to flashing up, on occasion, with great urgency -- demanding a response, whether or not philosophers (or anyone else) are prepared to answer them.

Zack distinguishes between two ways of philosophizing about justice. One treats justice as an ideal that can be defined and reasoned about, even if no real society in human history ever “fully instantiates or realizes an ideal of justice for all members of that society.” Efforts to develop a theory of justice span the history of Western philosophy.

The other approach begins with injustice and seeks to understand and correct it. Of course, that implies that the philosopher already has some conception of what justice is -- which would seem to beg the question. But Zack contends that theories of justice also necessarily start out from pre-existing beliefs about what it is, which are then strengthened or revised as arguments unfold.

“However it may be done and whatever its subject,” Zack writes, “beginning with concrete injustice and ending with proposals for its correction is a very open-ended and indeterminate task. But it might be the main subject of justice about which people who focus on real life and history genuinely care.”

The philosopher Zack describes may not start out with a theory of what justice is. But that’s OK -- she can recognize justice, paradoxically enough, when it's gone missing.

I wish the author had clarified the approach in the book’s opening pages, rather than two-thirds of the way through, because it proves fundamental to almost everything else she says. She points out how police killings of young, unarmed African-American males over the past couple of years are often explained with references to “white privilege” and “the white supremacist system” -- examples of a sort of ad hoc philosophizing about racial injustice in the United States, but inadequate ones in Zack’s analysis.

Take the ability to walk around talking on the phone carrying a box of Skittles. It is not a “privilege” that white people enjoy, as should be obvious from the sheer absurdity of putting it that way. It is one of countless activities that a white person can pursue without even having to think about it. “That is,” Zack writes, “a ‘privilege’ whites are said to have is sometimes a right belonging to both whites and nonwhites that is violated when nonwhites are the ones who [exercise] it.”

In the words of an online comment the author quotes, “Not fearing the police will kill your child for no reason isn’t a privilege, it’s a right.” The distinction is more than semantic. What Zack calls “the discourse of white privilege” not only describes reality badly but fosters a kind of moral masochism, inducing “self-paralysis in the face of its stated goals of equality.” (She implies that white academics are particularly susceptible to "hold[ing] … progressive belief structures in intellectual parts of their life that are insulated from how they act politically and privately …")

Likewise, “the white supremacist power structure” is a way of describing and explaining oppression that is ultimately incapacitating: “After the civil rights movement, overt and deliberate discrimination in education, housing and employment were made illegal and explicitly racially discriminatory laws were prohibited.” While “de facto racial discrimination is highly prevalent in desirable forms of education, housing and employment,” it does no one any good to assume that “an officially approved ideology of white supremacy” remains embodied in the existing legal order.

None of which should be taken to imply that Zack denies the existence of deep, persisting and tenacious racial inequality, expressed and reinforced through routine practices of violence and humiliation by police seldom held accountable for their actions. But, she says, "What many critics may correctly perceive as societywide and historically deep antiblack racism in the United States does not have to be thoroughly corrected before the immediate issue of police killings of unarmed young black men can be addressed."

She is not a political strategist; her analyses of the bogus logic by which racial profiling and police killings are rationalized are interesting, but how to translate them into action is not exactly clear. In the end, though, justice and injustice are not problems for philosophers alone.

In new book, faculty member urges universities to hold themselves to higher levels of accountability and inclusivity

Author discusses new book in which he argues that institutions can hold themselves to higher levels of accountability and inclusivity.

Chegg to run Bowdoin College textbook center

Fewer and fewer students are buying their textbooks at the Bowdoin College bookstore, so the college is outsourcing its textbook center to Chegg.

New book proposes teaching-intensive tenure-track model to address 'real' crisis in the humanities

New book proposes teaching-intensive tenure track to address what it calls the "real" crisis in the humanities.

