Review of Bill Maurer, 'How Would You Like to Pay? How Technology Is Changing the Future of Money'

Five years ago this month, a consortium of major telecommunications carriers (AT&T, T-Mobile and Verizon) announced that it was developing a new application that would enable customers to pay for goods and services using their smartphones. This “mobile wallet,” as such technology is now commonly called, would transmit credit card and debit account information to merchants wirelessly.

Other enterprises, including banks and American Express, soon joined the partnership. The application seemed well positioned to enter the market for hyperconvenient consumerism, even to dominate it. But things did not work out that way. A demo during the plenary session of a major conference on new payment technologies in 2013 went badly. Consumers complained that the app’s “setup and payment processes were cumbersome and frustrating,” in the words of Chris Welch in The Verge. But those were minor scratches compared to the self-inflicted fatality of the app’s name: Isis, which by then was far better known as the acronym of a terrorist organization. It gets worse: a gift card bearing the words “serve ISIS” was still in circulation even after the product was renamed Softcard in 2014.

“Probably few consumers even knew of its existence until the media bump it received from its rebranding,” Bill Maurer notes in How Would You Like to Pay? How Technology Is Changing the Future of Money (Duke University Press). That bump was clearly not enough: Softcard shut down early this spring. Meanwhile, the range of mobile wallets on the market has only grown. The information technology research firm Gartner estimated that mobile payments totaled $163 billion worldwide in 2012 and anticipates that they will reach $720 billion by 2017.

“This is definitely an ecosystem in flux,” one business and technology columnist wrote last month, “partly because there are so many players offering so many different solutions -- and so many questions about compatibility and security.”

“Ecosystem” seems an interesting choice of words in this context. Maurer, an anthropologist and dean of the school of social sciences at the University of California at Irvine, also uses it -- but in a much thicker sense than as a synonym, more or less, for “market.” The smartphone wallet represents only one means of mobile payment, limited mainly to the world’s more prosperous sectors. It’s in the poorer countries of the global South that mobile payment (using phones with text-messaging capabilities and maybe a little built-in flashlight) looms as a much larger part of everyday life: an economic and social link between urban and rural areas.

In Kenya, the M-Pesa mobile payment service launched in 2007, and within three years, more people were using it than had bank accounts. Over half of the country’s households had adopted it by 2011, and Maurer writes that M-Pesa “processed in that year more transactions within Kenya than Western Union had done globally.”

The contrasting fortunes of Isis/Softcard and M-Pesa (where M stands for “mobile” and “pesa” is the Swahili word for money) are striking; how well each met the demands of the people using them obviously differed significantly. But Maurer’s interest runs deeper than the great disparities between the respective societies.

We’re prone to think of money as a medium of exchange originally created to get around the vagaries of barter (e.g., it’s hard to make change for a goat) and also as a tool notoriously indifferent to how it’s used. With $10,000, you can furnish your apartment or hire a contract killer. Money itself, so understood, is both fungible and morally inert. And from that perspective, the recent technological innovations in how money can be transferred from one person or place to another are significant chiefly for whatever changes they make in the speed, ease or degree of anonymity of the exchange.

Maurer’s subtitle seems to promise speculation on how money will change, but his stress on the idea of payment (or, better, payment systems) has a decidedly retrospective component. In the abstract, the value of $10,000 in cash is the same as that of $10,000 in diamonds, bitcoins or traveler's checks. Each can be used as a form of money, for exchange.

But in practice, different kinds of social infrastructure are involved in making the transaction feasible -- or even possible -- with considerable implications about the relationships among the people involved. I have not made the experiment, but I doubt you can buy furniture using diamonds, and paying a hit man with a money order seems like a bad idea.

At some point, bitcoins might have the nearly universal acceptance that cash now does; both are fungible and, in principle, anonymous. But those qualities do not inhere in the paper or the digital currency itself: each is part of a payment system, without which it would be worthless. The same is true of credit cards, of course, or smartphones-turned-wallets.

Money of whatever sort is an “index,” the author says, of “relationships of obligation, rank, clientage, social belonging or state sanction.” Furthermore, old payment systems don’t necessarily die off; more than one can be operating in a given society at the same time. Maurer describes the interesting and intricate ways long-distance charge cards have become integrated into African economies where cash and barter also have a place. Aware of the fantastically destructive effects of the last financial crisis, he is clearly concerned that the advantages of being integrated into the global economy could be wiped out in the long term, through no fault of the continent's mobile users themselves.

In the end, How Would You Like to Pay? is of interest less for what it says about the future (the author makes no predictions -- which, given the Isis debacle, seems prudent) than for how it encourages the reader to pay attention to nuances of the present. It’s a primer of the anthropological imagination -- and a reminder that money is too important a matter to leave to the economists.

Michael Griffin, 'Islamic State: Rewriting History' and Patrick Cockburn, 'The Rise of Islamic State'

Between the Boston Marathon bombing and more spree killings by heavily armed men with grievances than anyone can count, we’ve all had plenty of recent experience with 24-hour coverage of horrific events such as the Paris massacre last Friday. There seem to be two major ways to manage attention. They don’t exhaust the possibilities but definitely mark the limits of a not especially broad spectrum.

One option is to keep track of breaking developments more or less in real time -- in extreme cases, checking for updates every few minutes. You expect the worst but try to get a jump on it, somehow, by absorbing each new crumb of pertinent information as it becomes available. The opposite extreme is to make like an ostrich and find some sand. Or at least to wait for fact, rumor and guesswork to be sorted out. Only at that point does catching up make sense; until then, there’s more noise than signal.

By temperament I lean toward the first pattern: obsessive scanning. But not after Friday. Maybe the thought of the worst being yet to come was too much to handle. In any event, I opted for burying my head in a couple of recent books, starting with Michael Griffin’s Islamic State: Rewriting History, published this month by Pluto Press and distributed by the University of Chicago Press. The other, The Rise of Islamic State: Isis and the New Sunni Revolution by Patrick Cockburn, was released by Verso earlier this year. Neither can be recommended to anyone whose nerves are easily jarred. But they give a much thicker account of the group that inspired the attacks than nonspecialists can piece together from news reports over the past couple of years.

The authors seem to have turned in the manuscripts to their respective publishers around this time in 2014. At that point, the potential for the new “caliphate” to inspire terrorism beyond the Middle East was a less pressing issue than its unprecedented arrival as a force in the region.

Cockburn’s book, which incorporates his reporting on the Islamic State in Iraq and Syria for the London Review of Books, vividly conveys the speed and range of the group’s expansion and consolidation: ISIS, “as though intoxicated by its own triumphs,” proved capable of “defeating its enemies with ease even when they were more numerous and better equipped.” From one of a number of “nonstate actors” in the region, ISIS transformed into something that, beyond simply proclaiming itself as the Islamic State, effectively dissolved the border between Iraq and Syria and imposed its own religious and military authority over “an area larger than Great Britain and inhabited by some six million people -- a population larger than that of Denmark, Finland or Ireland.”

At the same time, ISIS remains, if not invulnerable to air strikes, then certainly prepared for them. It “operates as a guerrilla army,” Cockburn says, “without easily visible movements of personnel or equipment that can be targeted. Its leadership is well practiced at keeping out of sight.”

But it’s Griffin’s book that actually tells in detail the story of where ISIS came from and how it transformed over time. The author is a political analyst and commentator for BBC World and Al Jazeera. He draws almost entirely on English-language publications, in contrast to Cockburn, who quotes an array of friends, interview subjects and bits of popular culture from around the Middle East. But Griffin integrates his sources to good effect. He traces the growth of ISIS out of what had been Abu Musab al-Zarqawi’s organization Al Qaeda in Iraq -- a group that had managed to alienate both Osama bin Laden and Iraqi insurgents fighting the U.S. occupation. The death of Zarqawi in 2006 seems to have created less of a power vacuum than an opening for more capable strategists to assert themselves.

They learned to adapt to and exploit specific local and tribal concerns while building up both an effective economic infrastructure and a formidable propaganda operation, taking advantage of the new-media skills of European-born jihadis who joined them. The ISIS cadre were also exceptionally lucky -- astonishingly, uncannily so -- in getting hold of new weaponry and tools. The psychological impact of the fall of Mosul in June 2014 was magnified by the sight of ISIS fighters “speeding towards outflanked enemies in hundreds of looted Humvees, bristling with assault rifles and rocket launchers.”

Plus the jihadis had “a fleet of white Toyota Tacoma pickups, double cabbed with mounted machine guns.” The vehicles, custom-made for U.S. Special Forces, were only available from a Toyota assembly plant in Texas. “How they managed to reach the frontiers of the caliphate,” Griffin says, “is anyone’s guess.”

Reading these books quickly was difficult, and the marginal notes and highlights I made along the way are evidence of how much more time it would take to grapple with them -- especially with regard to the authors’ differing understandings of the Arab Spring, and of the Syrian uprising in particular.

What they concur on, unsurprisingly, is the emerging status quo, with the Islamic State obliging the United States and Iran to act as allies for the foreseeable future, despite their saber-rattling impulses toward one another. The situation was paradoxical and hard to imagine as stable even before the terrorist attacks of last week. Is there even a word for how things stand now?

Author discusses new book arguing that China's universities are rising while America's decline

Author discusses new book arguing that China's universities are rising while America's are in decline.

Review of Christophe Charle's 'Birth of the Intellectuals 1880-1900'

Someone once characterized the intellectual as a person “living by, for and off of ideas.” Another remark in the same terse vein calls the intellectual someone who habitually reads with pen in hand.

Neither definition would pass muster with historians or sociologists, but they are ideal for ordinary usage. Each takes the distinguishing feature of the intellectual to be certain ingrained ways of directing the attention -- without making any claim about his or her personal qualities or social status. This has the advantage of keeping snarling to a minimum. (No other noun provokes so much insecurity and hostility -- people often feel compelled to put “pseudo-” or “self-described” in front of it.)

The trouble with such placid or neutral definitions, however, is that “les intellectuels” were born in a scene of great hostility and named amid a prolonged exchange of insults. Christophe Charle’s Birth of the Intellectuals 1880-1900 (Polity) uses the French word when referring to the academics and authors who rallied to the defense of Alfred Dreyfus in the late 1890s and the English word “when it is to be understood in the broader sociological sense.” (Charle is a professor of contemporary history at the Université Paris 1 Panthéon-Sorbonne.) The distinction is useful, but the thrust of the book’s argument is that the persisting ambiguities and disputes over the concept -- even within sociology -- were already taking shape before the intellectuels rallied behind Émile Zola’s manifesto J’accuse in 1898.

That the author is a historian would not necessarily be a reader’s first guess. Charle draws so heavily on the ideas and perspectives of Pierre Bourdieu that I assumed he was a sociologist using educational and literary developments in France in the late 19th century for a case study of the field of cultural production. Anyone not already acquainted with the Dreyfus controversy, at least in broad outline, is bound to give up quickly: this is historical writing in which the narrative element approaches the vanishing point.

Instead of events, Charle reconstructs the categories and niches of brainwork during the late 19th century as each established its role in French society. At the same time, each was also marking off its own respective criteria for recognition and advancement. Republican principles, established by a century of revolutions, did not preclude the emergence of an elite -- but it had to be based on merit rather than bloodline, a “nonexclusive aristocracy” cultivated by the educational system.

Louis Pasteur served as an exemplary case: a man of modest origins, he had contributed to the well-being of humanity (the rabies and anthrax vaccines) and advanced knowledge (the germ theory) while bolstering the French economy with his method for keeping wine and milk from going sour. He was the patient, methodical laboratory researcher as national hero: his expertise possessed a recognizable social value.

Supplementing the tremendous prestige of the sciences was the model of specialized, highly professionalized scholarship in all disciplines practiced in German universities. So teaching and research could be understood and valued -- by those who practiced them and by laypeople alike -- as matters of public importance. That was true even when scientists and scholars prided themselves on remaining so concentrated on their areas of specialization that they ignored everything else.

The situation among novelists, poets, essayists and other writers was altogether murkier -- in part because of the rapid and chaotic development of the publishing industry, especially given its susceptibility to economic pressures. The number of aspirants always exceeds the number of positions offering a writer access to readers or money (much less, as in the best case, both). And the range of available outlets for publication tends to interact with writers’ own interests, styles and degrees of mutual hostility in fairly volatile ways. Whether new literary movements create new literary journals or vice versa can only be determined on a case-by-case basis; even then, it will be partly guesswork.

The term “intellectuel” seems to have been coined in France in the early 1890s, in the small but serious journals of debate written and edited by, well, intellectuals -- that is, writers, academic and otherwise, who expressed political and cultural opinions largely critical of the established order. The word is often said to have entered English as a neologism in the wake of the Dreyfus case. (Charle seems to make the same assumption, although Stefan Collini quotes Byron using it as early as 1813.)

In any event, the intellectuels who intervened to defend Dreyfus -- accusing the French military of anti-Semitism and of covering up evidence that would exonerate him -- were drawn from the ranks of the professoriate as well as writers, both creative and journalistic, established and otherwise. Charle goes over their social backgrounds, career trajectories and political affiliations exhaustively. In analyzing the statements they wrote and published, he pays close attention to how famous names and distinguished institutional affiliations were sometimes featured prominently to signal their authority and the seriousness of the cause. At other times, the indicators of prominence were downplayed: the list of signatures might have an illustrious professor alongside an obscure poet or an ordinary citizen.

The debate over Dreyfus quickly spun off what sounds like an equally nasty one over whether the intellectuels were heirs of Voltaire’s role as the voice of reason and justice against oppression, or just people interfering in military matters for which their education and verbal skills gave them no claim to competence. Because of its challenge to authority and the involvement of many figures with known anarchist or socialist tendencies, the pro-Dreyfus cause was largely understood as a movement of the left -- which inspired the anti-Dreyfusards to come up with accusations that they were radical elitist hypocrites. (A bit rich, given that those denouncing Dreyfus as a traitor included people who wanted to restore the monarchy.)

As for Dreyfus, he was exonerated a few years later. The notion that intellectuals can, and should, play some role as critics of the society they live in was established. Debate over how well they perform that function never ends, nor should it. And the snarling, of course, continues unabated.

U of Michigan Press endorses accessible book publishing guidelines

The disability studies scholars behind guidelines on accessibility in publishing gain their first endorsement from a university press.

Review of 'Philip Sparrow Tells All'

A novelist and English professor named Samuel M. Steward was fired by DePaul University in 1956 for the offense of running a tattoo parlor on Chicago’s Skid Row. He did not have the option (so readily taken for granted these days) of explaining it as full-immersion ethnographic research, nor did the fact that he’d practiced this sideline under a pseudonym, Philip Sparrow, count as mitigation. By then Steward was in his late 40s and had been teaching for well over 20 years, but his academic career was finished.

It was a moment of emergence, however, not of decline. Within a few years, the defrocked professor moved to California. His artistry with the ink gun put Philip Sparrow in demand among the Hell’s Angels, whose standards are rigorous and exacting to a degree academe never quite manages. (Being thrown out of the Angels can include relinquishing one’s tattoos, a process that sometimes involves a blowtorch.)

In the late 1970s, he went back to using his given name and under it published, among other things, a collection of letters from his friends Gertrude Stein and Alice B. Toklas. But use of a pseudonym seems to have permitted the flourishing of aspects of his creative identity that might have gone unexpressed. Besides his renown among tat connoisseurs as Philip Sparrow, he also wrote a considerable amount of pornographic fiction under the name Phil Andros -- which was kind of a meta pun, splitting the Greek components of “philanderer” (a man who has sex with a great many women) and repurposing them for gay use (“lover of men”).

He died in 1993 at the age of 84, leaving behind the papers that allowed Justin Spring to write Secret Historian: The Life and Times of Samuel Steward (Farrar, Straus and Giroux), a National Book Award Finalist for 2010. The big online booksellers show the fiction of Phil Andros to be available and in demand, although nearly everything Steward published under his own name has long since gone out of print. But the University of Chicago Press has now added something to his stature as an author by publishing Philip Sparrow Tells All: Lost Essays by Samuel Steward, Writer, Professor, Tattoo Artist, edited by Jeremy Mulderig -- who, by a nice bit of karma, happens to be an associate professor emeritus of English at DePaul University.

Occasionally a book’s subtitle all but defies the reader not to have a look, and in this case the photos of the author on the front cover alone are pretty striking, given Steward's resemblance to John Waters. The contents of the volume are selected from the column Steward wrote, under the Philip Sparrow byline, for the Illinois Dental Journal between 1944 and 1949.

So the publisher’s description says: I did not make it up, nor could I. While Justin Spring’s biography of Steward from five years ago had been widely and well reviewed, I had not heard about it, and so I suspected, for a moment, that Philip Sparrow Tells All was a prank, either by the University of Chicago Press or on it. The essays of a tattoo artist recovered from 70-year-old issues of the Illinois Dental Journal? Come on.

Exercising due diligence, I learned just enough to confirm that the author actually existed -- then decided to stop reading more about him. First, I wanted to read some of the essays themselves. The world is full of colorful characters who try to write, but eccentricity and adventurousness are not, in practice, qualifications for authorship. (To their credit they sometimes recognize this and offer to tell a writer their stories, in exchange for a share of the advance.) So I skipped the book’s introductory matter and the headnotes the editor had prepared for each piece and went right to Steward’s own prose.

The first selection, his inaugural column, was indeed written with the publication’s audience in mind: “The Victim’s Viewpoint: On Sublimated Sadism; or, the Dentist as Iago.” The tone or manner is approximately that of Robert Benchley:

“We have opened our mouth widely for many of these torturers, from Maine to Montana, and we are ready to swear that on more than one occasion -- as we have been approached, lying there helpless and trembling -- we have seen a diabolic gleam in their eyes as they reached for their tools. There is one certain prober, doubtless invented by Beelzebub, which they use when they begin their preliminary surveying. It is shaped vaguely like a sophisticated corkscrew, and is evidently intended to search out the secret places of one’s heart; we personally have felt it go even lower, and are sure it once left a scar on our right kidney. … but let us draw a curtain over this painful scene; even in thinking of it we have worked ourselves into a cold dribble.”

Something like this essay probably appeared in every college humor magazine in the country at least once per issue for a decade on either side of its January 1944 publication date. It seems well-enough done for what it is; the best that might be said for it is that the author cleared his throat.

An abrupt shift in topic and style comes with the following piece, “On Cryptography,” published that October -- a sprightly introduction to a matter of great wartime interest. The title sounds like an allusion to the essays of Montaigne, and where the Iago reference in his debut seemed arch and contrived, here Steward’s use of classical and contemporary references (e.g., calling Suetonius “the Walter Winchell of ancient Rome”) proves both humorous and apropos. The next month’s column “On Alcoholics Anonymous” -- explaining the principles of an organization just beginning to catch the public’s attention -- comes about as close to memoir as possible without violating the distance implied by the authorial “we.”

It’s a remarkable progression in just three essays, and it doesn’t end there. With the measure of safety provided by a pseudonym -- and also by the less-than-mass circulation of the Illinois Dental Journal -- Steward experimented with the comic, personal and confessional modes of the casual essay in ways that might have been difficult to risk otherwise.

After sampling enough of the book to determine that the columns were of interest in their own right, rather than as the supplement to the author’s biography, I started reading Jeremy Mulderig’s introductory material. It clarifies a great deal, beginning with the essayist’s venue: Steward was attracted to his dentist, who happened to be the magazine’s editor. Its more typical fare was articles with titles like “Important Considerations in Porcelain Veneer Restoration,” but a column written from a nonprofessional vantage point seemed worthwhile, if only for variety. The dentist accepted Steward’s offer to write for the journal, though not, it seems, his other propositions.

After writing several pieces for “The Victim’s Viewpoint” (the column’s title for most of 1944), Steward decided to reboot it as something more wide-ranging. Which explains the nine-month gap between the first and second selections in Philip Sparrow Tells All, and the marked change in the writing itself. Including just one piece from the column’s beta version seems like a wise choice on Mulderig’s part. The wit and whimsy of dentistry as seen from the patient’s-eye view must have been stretched pretty thin after a couple of months.

Many of the columns take on a more humorous quality when you know that the author had a sex life active enough to impress Alfred Kinsey. And no doubt that will be a selling point for the book. But the tension between overt statement and implicit meaning can have effects other than amusement, and in the case of one essay, that tension seems especially powerful.

Published in February 1945, it anticipates the difficulties ahead as American society tries to reabsorb returning servicemen (and vice versa). Here is one passage:

“Only those who have been shot at can love and understand each other. We at home can never comprehend the powerful fraternalism that unites the men who belong, by reason of their experiences, to the ghostly brotherhood of war. When death is behind a bush that trembles, when it explodes in burning phosphorous to kill the friend who was joking a moment before, when it surrounds you with bodies black with flies and bloated by the sun until they at last explode, when your foot slides upon the stinking decayed intestines of a thing that was once a man -- only then, after the bony fingers have inscribed the membership card with your name, and you have looked into the fearful emptiness of the sockets in a fleshless skull, are your dues paid and you yourself a member of the League of War. … They have their own code of morals which we cannot possibly understand, and which will baffle and dismay us utterly. They will be startled and chagrined by what they will consider our indifference, but is really only our own inexperience slowly woven around us in our geographically and emotionally isolated chrysalis.”

Meaningful enough as these lines are on the most manifest level, they take on even more significance in the light of Allan Bérubé’s Coming Out Under Fire: The History of Gay Men and Women in World War Two (Free Press, 1990). Bérubé showed how important the experience of the war was to the formation of a sense of gay identity and community in the United States.

Steward himself was a Naval enlistee but did not see combat. There is an ambivalence, intimacy, pain and sadness to the essay that can be felt by a reader who knows nothing about the author. But it seems clear that the traumatized fighting men he depicts weren’t sociological abstractions but friends and lovers.

It bears reiterating that the name under which he published the essay, Philip Sparrow, was the one he later used as a tattoo artist -- and the one he preferred to go by for some while after being expelled from the groves of academe. It was the identity he assumed at the limits of permissible expression. “Man is least himself when he talks in his own person,” wrote Oscar Wilde. “Give him a mask, and he will tell you the truth.”

Commentary on Heidegger's Black Notebooks

Every few years, somebody notices that Martin Heidegger was a Nazi -- and it all starts up again: the polemics, the professions of shock, the critiques of his philosophy’s insidious role in the humanities. At times the denunciations have a rather generic quality, as if a search-and-replace macro had been used to repurpose a diatribe against John Dewey or Jacques Derrida. Calls for a boycott of Heidegger’s writings are issued by people who cannot name two of them.

The Heidegger bashers tend to be the loudest, but there are counterdemonstrators. Besides the occasional halfhearted search for mitigating circumstances (the Weimar Republic did not make for clear thinking, after all, and the man’s thought was normally pitched at stratospheric levels of abstraction rather than the grubby domain of party politics) there is the sport of itemizing the anti-Heideggerians’ lapses in scholarship. Every line of argument on either side of the dispute was established during the controversy provoked by Victor Farias’s Heidegger and Nazism (1987), yet l’affaire Heidegger has been recycled on at least three or four occasions since then. It’s as if the shock of the scandal was so great that it induced amnesia each time.

The most recent episode (Heidegger Scandal 5.0?) followed the publication in Germany, last year, of the first batch of the philosophical and political musings that Heidegger began recording in notebooks from 1931 onward. An English translation is forthcoming, so count on the outrage to renew in about six months. In the meantime, let me recommend a sharp and succinct overview of the Heidegger matter that may be of interest to anyone who hasn’t caught the earlier reruns. It appeared in the interdisciplinary journal Science & Society under the title “Notes on Philosophy in Nazi Germany.” The author, V. J. McGill, was for many years a professor of philosophy and psychology at Hunter College. “In the midst of the disillusionment and insecurity of postwar Germany and emerging fascism,” he wrote:

“Heidegger saw in care (Sorge) and anxiety (Angst), the basic substance of man. Instead of offering a rational solution of some kind he devoted himself to fine-spun philological distinctions, to an analysis of the pivotal concept of ‘nothing’ and to a subtle exploration of ‘death’ of which he says that we experience it only in the mode of ‘beside’ -- that is, beside death. History, culture, freedom, progress are illusory. He finds our salvation in the recovery of a primordial sense of coexistence with other beings, that is, a bare feeling of togetherness, deprived of all the amenities and hopes which make social life worth while ….

“The hundreds who flocked to Heidegger's very popular lectures in Freiburg learned that anxiety is the final, irremedial substance of man, and left with such esoteric wisdom as their main reward. Heidegger's philosophy was not distasteful to the Nazis, and when he was made rector of the University of Freiburg, he gave an address praising the new life which animated German universities. In recent years a rift has occurred. But philosophers can fall out with the Nazis on other grounds than their ideas and doctrines.”

McGill’s article was published in 1940. Over the intervening three quarters of a century, additional details have emerged, including documentation that Heidegger was not just an ally of the Nazi Party but also a full member from 1933 to 1945. And interest in his work on the part of several generations of philosophers who never showed the slightest bent towards fascism has meant much debate over the validity of reducing Heidegger’s philosophical concepts to their political context. But for all the anger that simmers in McGill’s discussion of Heidegger as an academic lackey of the Third Reich, his account is matter-of-fact and nonsensationalist, and little of the recent commentary can be said to improve upon it.

The Black Notebooks, as Heidegger’s private ruminations are known, sound ghastly on a number of fronts. The volumes published so far cover the years 1931 through 1941. Those covering the rest of the war years are being edited, and Heidegger is reported to have continued keeping the notebooks until his death in 1976. Richard Polt, a professor of philosophy at Xavier University and a translator of Heidegger’s work, identifies 19 passages (out of about 1,200 pages) that attack Jews in terms that might as well have come from an editorial by Joseph Goebbels. After the war Heidegger claimed to have become disillusioned with the Nazis within a couple of years of joining the party -- and the notebooks show this to have been true, strictly speaking. But his objections were to the boorishness and careerism of men who didn’t share his lofty understanding of Hitler’s ideology.

As with the anti-Semitism, this does not come as a revelation, exactly. His reference to “the inner truth and greatness of National Socialism” in a lecture from 1935 remained in the text when he published it in 1953. More than defiant, it was a gesture of a certain megalomania: Heidegger hadn’t betrayed the Führer’s vision, the Nazis had!

But as David Farrell Krell, a professor emeritus of philosophy at DePaul University, suggests in a recent issue of Research in Phenomenology, the Black Notebooks reveal not just disappointment with the regime (combined with perfect callousness towards its brutality) but levels of rage, bile and despair that keep him from thinking. Heidegger cannot challenge himself, only repeat himself. “From day to day and day after day,” Krell says, Heidegger “entirely forgets that he has written the same things over and over again with the identical amount of dudgeon.”

Heidegger loathed Freud and psychoanalysis, which only makes it tempting to subject him to a little armchair diagnostics. But Krell's point, if I understand him correctly, is that the repetitiveness is more than symptomatic; the Black Notebooks document Heidegger not as a philosopher seduced by totalitarian politics, but as someone who has quit blazing a pathway of thought and instead become trapped in a maze of his own fabrication. Unfortunately, he is not the only one so trapped:

“At least part of the allure of the ongoing Heidegger scandal,” writes Krell in a passage that lights up the sky, “is that it distracts us from our own appalling national stupidities and our galling national avarice -- our own little darkenings, if you will. It is so much easier to fight battles that have already been decided and so lovely to feel oneself securely moored in the harbor of god’s own country. Not the last god but the good old reliable one, who blesses every stupidity and earns interest on every act of avarice. … The irony is that Heidegger’s Notebooks themselves reflect this dire mood. Perhaps by condemning him and them, we hope to gain a bit of space for ourselves, some impossible space for ourselves? That is the shadow these Notebooks cast over those who are so anxious to condemn. And that would be the Notebooks’ most terrible victory: it is not that the last laugh laughs best, for there is no joy and no laughter in them, but that their helpless rage recurs precisely in those who rail against them.”

Remember that next spring, when the controversy starts up once more.

Review of Finn Brunton and Helen Nissenbaum, 'Obfuscation: A User's Guide for Privacy and Protest'

When downloading an app or approving a software update, I now usually hesitate for a moment to consider something the comedian John Oliver said early this summer: a software company could include the entire text of Mein Kampf in the user agreement and people would still click the “agree” button.

“Hesitates” is the wrong word for something that happens in a fraction of a second. It’s not as if I ever scrolled back through to make sure that, say, Microsoft is not declaring that it owns the copyright to everything written in OneNote or Word. The fine print goes on for miles, and anyway, a user agreement is typically an all-or-nothing proposition. Clicking “agree” is less a matter of trust than of resignation.

But then, that’s true about far more of life in the contemporary digital surround than most of us would ever want to consider. Every time you buy something online, place a cell phone call, send or receive a text message or email, or use a search engine (to make the list no longer nor more embarrassing than that), it is with a likelihood, verging on certainty, that the activity has been logged somewhere -- with varying degrees of detail and in ways that might render the information traceable directly back to you. The motives for gathering this data are diverse; so are the companies and agencies making use of it. An online bookseller tracks sales of The Anarchist Cookbook in order to remind customers that they might also want a copy of The Minimanual of the Urban Guerrilla, while the National Security Agency will presumably track the purchase with an eye to making correlations of a different sort.

At some level we all know such things are happening, probably without thinking about it any more often than strictly necessary. Harder to grasp is the sheer quantity and variety of the data we generate throughout the day -- much of it trivial, but providing, in aggregate, an unusually detailed map of what we do, who we know and what’s on our minds. Some sites and applications have “privacy settings,” of course, which affect the totality of the digital environment about as much as a thermostat does the weather.

To be a full-fledged participant in 21st-century society means existing perpetually in a state of information asymmetry, in the sense described by Finn Brunton and Helen Nissenbaum in Obfuscation: A User’s Guide for Privacy and Protest (MIT Press). You don’t have to like it, but you do have to live with it. The authors (who teach media culture and communications at New York University, where they are assistant professor and professor, respectively) use the term “obfuscation” to identify various means of leveling the playing field, but first it’s necessary to get a handle on information asymmetry itself.

For one thing, it is distinct from the economic concept of asymmetrical information. The latter applies to “a situation in which one party in a transaction has more or superior information compared to another.” (So I find it explained on a number of websites ranging from the scholarly to the very sketchy indeed.) The informed party has an advantage, however temporary; the best the uninformed can do is to end up poorer but wiser.

By contrast, what Brunton and Nissenbaum call information asymmetry is something much more entrenched, persistent and particular to life in the era of Big Data. It occurs, they explain, “when data about us are collected in circumstances we may not understand, for purposes we may not understand, and are used in ways we may not understand.” It has an economic aspect, but the implications of information asymmetry are much broader.

“Our data will be shared, bought, sold, analyzed and applied, all of which will have consequences for our lives,” the authors write. “Will you get a loan, or an apartment, for which you applied? How much of an insurance risk or a credit risk are you? What guides the advertising you receive? How do so many companies and services know that you’re pregnant, or struggling with an addiction, or planning to change jobs? Why do different cohorts, different populations and different neighborhoods receive different allocations of resources? Are you going to be, as the sinister phrase of our current moment of data-driven antiterrorism has it, ‘on a list’?”

Furthermore (and here Brunton and Nissenbaum’s calm, sober manner can just barely keep things from looking like one of Philip K. Dick’s dystopian novels), we have no way to anticipate the possible future uses of the galaxies of personal data accumulating by the terabyte per millisecond. The recent series Mr. Robot imagined a hacker revolution in which all the information related to personal debt was encrypted so thoroughly that no creditor would ever have access to it again. Short of that happening, obfuscation may be the most practical response to an asymmetry that’s only bound to deepen with time.

A more appealing word for it will probably catch on at some point, but for now “obfuscation” names a range of techniques and principles created to make personal data harder to collect, less revealing and more difficult to analyze. The crudest forms involve deception -- providing false information when signing up with a social media site, for example. A more involved and prank-like approach would be to generate a flood of “personal” information, some of it true and some of it expressing one’s sense of humor, as with the guy who loaded up his Facebook profile with so many jobs, marriages, relocations, interests and so on that the interest-targeting algorithms must have had nervous breakdowns.

There are programs that will click through on every advertisement that appears as you browse a site (without, of course, bothering you with the details) or enter search engine terms on topics that you have no interest in, thereby clouding your real searches in a fog of irrelevancies.

The cumulative effect would be to pollute the data enough to make tracking and scrutiny more difficult, if not impossible. Obfuscation raises a host of ethical and political issues (in fact the authors devote most of their book to encouraging potential obfuscators to think about them) as well as any number of questions about how effective the strategy might be. We’ll come back to this stimulating and possibly disruptive little volume in weeks to come, since the issues it engages appear in other new and recent titles. In the meantime, here is a link to an earlier column on a book by one of the co-authors that still strikes me as very interesting and, alas, all too pertinent.

Review of Jacques Le Goff, 'Must We Divide History Into Periods?'

George Orwell opened one of his broadcasts on the BBC in the early 1940s by recounting how he’d learned history in his school days. The past, as his teachers depicted it, was “a sort of long scroll with thick black lines ruled across it at intervals,” he said. “Each of these lines marked the end of what was called a ‘period,’ and you were given to understand that what came afterwards was completely different from what had gone before.”

The thick black lines were like borders between countries that didn’t know one another’s languages. “For instance,” he explained, “in 1499 you were still in the Middle Ages, with knights in plate armour riding at one another with long lances, and then suddenly the clock struck 1500, and you were in something called the Renaissance, and everyone wore ruffs and doublets and was busy robbing treasure ships on the Spanish Main. There was another very thick black line drawn at the year 1700. After that it was the Eighteenth Century, and people suddenly stopped being Cavaliers and Roundheads and became extraordinarily elegant gentlemen in knee breeches and three-cornered hats … The whole of history was like that in my mind -- a series of completely different periods changing abruptly at the end of a century, or at any rate at some sharply defined date.”

His complaint was that chopping up history and simplistically labeling the pieces was a terrible way to teach the subject. It is a sentiment one can share now only up to a point. Orwell had been an average student; today it would be the mark of a successful American school district if the average student knew that the Renaissance came after the Middle Ages, much less that it started around 1500. (A thick black line separates his day and ours, drawn at 1950, when television sales started to boom.)

Besides, the disadvantages of possessing a schematic or clichéd notion of history are small by contrast to the pleasure that may come later, from learning that the past was richer (and the borders between periods more porous) than the scroll made it appear.

Must We Divide History Into Periods? asked Jacques Le Goff in the title of his last book, published in France shortly before his death in April 2014 and translated by M. B. DeBevoise for the European Perspectives series from Columbia University Press. A director of studies at L'École des Hautes Études en Sciences Sociales in Paris, Le Goff was a prolific and influential historian with a particular interest in medieval European cities. He belonged to the Annales school of historians, which focused on social, economic and political developments over very long spans of time -- although his work also exhibits a close interest in medieval art, literature and philosophy (where changes were slow by modern standards, but faster than those in, say, agricultural technique).

Le Goff’s final book revisits ideas from his earlier work, but in a manner of relaxed erudition clearly meant to address people whose sense of the past is roughly that of young Orwell. And in fact it is that heavy mark on the scroll at the year 1500 -- the break between the Middle Ages and the Renaissance -- that Le Goff especially wants the student to reconsider. (I say “student” rather than “reader” because time with the book feels like sitting in a lecture hall with a memorably gifted teacher.)

He quotes one recent historian who draws the line a little earlier, with the voyage of Christopher Columbus in 1492: “The Middle Ages ended, the modern age dawned, and the world became suddenly larger.” Le Goff is not interested in the date but in the stark contrast that is always implied. Usually the Middle Ages are figured as “a period of obscurity whose outstanding characteristic was ignorance” -- happily dispelled by a new appreciation for ancient, secular literature and a sudden flourishing of artistic genius.

Calling something “medieval” is never a compliment; the image that comes to mind is probably that of a witch trial. By contrast, “Renaissance” would more typically evoke a page from Leonardo da Vinci’s notebooks. Such invidious comparison is not hard to challenge. Witch trials were rare in the Middle Ages, while the Malleus Maleficarum appeared in “the late fifteenth century,” Le Goff notes, “when the Renaissance was already well underway, according to its advocates.”

Given his expertise on the medieval city -- with its unique institutional innovation, the university -- Le Goff makes quick work of demolishing the notion of the Middle Ages having a perpetually bovine and stagnant cultural life. The status of the artist as someone “inspired by the desire to create something beautiful” who had “devoted his life to doing just this” in pursuit of “something more than a trade, nearer to a destiny,” is recognized by the 13th century. And a passage from John of Salisbury describes the upheaval underway in the 12th century, under the influence of Aristotle:

“Novelty was introduced everywhere, with innovations in grammar, changes in dialectic, rhetoric declared irrelevant and the rules of previous teachers expelled from the very sanctuary of philosophy to make way for the promulgation of new systems …”

I can’t say that the name meant anything to me before now, but the entry on John of Salisbury in the Stanford Encyclopedia of Philosophy makes it sound as if Metalogicon (the work just quoted) was the original higher ed polemic. It was “ostensibly written as a defense of the study of logic, grammar and rhetoric against the charges of the pseudonymous Cornificius and his followers. There was probably not a single person named Cornificius; more likely John was personifying common arguments against the value of a liberal education. The Cornificians believe that training is not relevant because rhetorical ability and intellectual acumen are natural gifts, and the study of language and logic does not help one to understand the world. These people want to seem rather than to be wise. Above all, they want to parlay the appearance of wisdom into lucrative careers. John puts up a vigorous defense of the need for a solid training in the liberal arts in order to actually become wise and succeed in the ‘real world.’”

That's something an Italian humanist might write four hundred years later to champion “the new learning” of the day. And that is no coincidence. Le Goff contends that “a number of renaissances, more or less extensive, more or less triumphant,” took place throughout the medieval era -- an elaboration of the argument by the American historian Charles H. Haskins in The Renaissance of the Twelfth Century (1927), a book that influenced scholars without, as Le Goff notes, having much effect on the larger public. The Renaissance, in short, was a renaissance -- one of many -- and in Le Goff’s judgment “the last subperiod of a long Middle Ages.”

So, no bold, clean stroke of the black Magic Marker; more of a watercolor smear, with more than one color in the mix. Le Goff treats the Middle Ages as having a degree of objective reality, insofar as certain social, economic, religious and political arrangements emerged and developed in Europe over roughly a thousand years.

At the same time, he reminds us that the practice of breaking up history into periods has its own history -- deriving, in its European varieties, from Judeo-Christian ideas, and laden with notions of decline or regeneration. “Not only is the image of a historical period liable to vary over time,” he writes, “it always represents a judgment of value with regard to sequences of events that are grouped together in one way rather than another.”

I'm not entirely clear how, or if, he reconciled the claim to assess periodization on strictly social-scientific grounds with its status as a concept with roots in the religious imagination, but it's a good book that leaves the reader with interesting questions.

Author and former college president offers advice to parents on the first year of college

How should parents prepare children for college? In a new book, a former college president takes a look at programs and resources at five different institutions to find out what students need and what parents should do during the first year of college.

