Political science

10 Years After

Even with the anniversary approaching, reading about 9/11 feels like a matter of duty, not desire. Especially with the anniversary, in fact: Magazines and PDF printouts devoted to the 10th anniversary of 9/11 accumulated on my desk for more than a week before I found the will to do more than stare at them. Eventually, the work ethic asserted itself, and this column will digest some of the recently published material on 9/11.

But this spell of hesitation bears mentioning, because a temporary failure of nerve was probably not just a personal idiosyncrasy. The event itself is hard to think about -- just as it was at the time. My recollection of that day is not primarily one of fear, though there was plenty of that. (We live in Washington; according to a news report that morning, a car bomb had gone off downtown; this proved false, but it stuck with you.) Rather, it was a state of extremely vivid confusion -- of being keenly aware of each passing hour, yet unable to take in the situation, let alone to anticipate very much of anything.

Who could? The experience was unprecedented. So much of the past decade of American life can be traced back to that day: wars, drones, security, surveillance, detention, “enhanced interrogation,” torture porn, the extremes of public emotion about having a president whose middle name is Hussein…. One thing that changed after 9/11 was that, after a while, people quit saying that “everything changed” on that date. But something did change, so that it is difficult to consider the way we live now without returning, sooner or later, to 9/11.

It is a date that names an era. Melvyn P. Leffler’s essay “9/11 in Retrospect,” appearing in Foreign Affairs, tries “to place the era in context and assess it as judiciously as possible.” That means from the perspective of an unexcitable centrism, with an eye to calculating the long-term effects on U.S. power. Leffler, a professor of history at the University of Virginia, is co-editor, with Jeffrey Legro, of In Uncertain Times: American Foreign Policy After the Berlin Wall and 9/11, published this summer by Cornell University Press. (At the time of this writing, his article is behind the journal’s paywall.)

Against those of us who believe that George W. Bush came into office with the intention of taking on Iraq, Leffler maintains that the administration was overwhelmingly preoccupied with domestic policy before 9/11 and improvised its doctrine of “anticipatory self defense [or] preventative warfare” out of “a feeling of responsibility for the public and a sense of guilt over having allowed the country to be struck.” In shifting gears, Bush and his advisers “had trouble weaving the elements of their policy into a coherent strategy that could address the challenges they considered most urgent.”

The combination of tax cuts and increased military expenditures “seriously eroded” the country’s “financial strength and flexibility,” even as occupations and counterinsurgencies undermined U.S. credibility as a force in the Middle East and Persian Gulf. “Iraq was largely eliminated as a counterbalance to Iran,” writes Leffler, “Iran’s ability to meddle beyond its borders increased, and the United States’ ability to mediate Israeli-Palestinian negotiations declined.” Meanwhile, “China’s growing military capability” began “endanger[ing] the United States’ supremacy in East and Southeast Asia” -- which was probably not high on Osama Bin Laden’s agenda, but history is all about the unexpected consequences.

The attacks on 9/11 “alerted the country to the fragility of its security,” Leffler concludes, as well as “the anger, bitterness, and resentment toward the United States residing elsewhere, particularly in parts of the Islamic world. But if 9/11 highlighted vulnerabilities, its aftermath illustrated how the mobilization of U.S. power, unless disciplined, calibrated, and done in conjunction with allies, has the potential to undermine the global commons as well as protect them.”

“It’s been a sad, lost, and enervating decade,” says the editorial note introducing the discussion of 9/11 in Democracy, a quarterly journal calling for “a vibrant and vital progressivism for the 21st century.” With contributions by 11 academics and journalists -- running to 35 pages of the fall issue -- there is too much to synopsize, but the title sums things up reasonably well: “America Astray.” (The symposium is currently posted online, in advance of the print edition.)

But two interventions stand out from the prevailing tone of frustration and worry. Being the gloomy sort myself, I want to emphasize them here, just to see what that’s like.

Elizabeth Anderson, who is a professor of philosophy and women’s studies at the University of Michigan at Ann Arbor, writes with evident disappointment that Bush’s legacy lives on: “Overall, Obama’s record on executive power and civil liberties diverges little from his predecessor. In certain respects it is even worse....” She refers to continued domestic spying, huge expenditures for the National Security Agency, the prosecution of leakers “on an unprecedented scale,” and Obama’s targeting of an American citizen, Anwar al-Awlaki, for “extrajudicial killing … even outside any battlefield context.”

“The traumatic experience of 9/11 lies behind all of these [actions and policies],” Anderson writes. But the revival of “public demand for privacy, civil liberties, and greater transparency is likely -- one hopes, anyway -- to override the fears that underwrite state violations of constitutional rights.” The profound demographic shifts of the coming decades mean that political parties “will soon see that they have more to gain by integrating immigrants and their American children into society than by pandering to anti-immigrant prejudice.”

Well, it’s hard to make predictions, especially about the future (as Yogi Berra said, or should have), but the notion of moving beyond the post-9/11 rut is certainly appealing. The other Democracy contributor to offer an encouraging word is Fawaz A. Gerges, the director of the Middle East Center at the London School of Economics, who recapitulates some of the argument from his book The Rise and Fall of Al-Qaeda, just published by Oxford University Press.

“The Arab Spring reinforced what many of us have known for a while,” he writes. “Al Qaeda’s core message is in conflict with the universal aspirations of the Arab world…. Bin Laden and his successor, Ayman al-Zawahiri, neither speak for the umma (the global Muslim community) nor exercise any influence on Arab public opinion.”

The organization has shrunk from three or four thousand fighters to perhaps a tenth of that, with its best cadres now either dead or “choosing personal safety over operational efficiency.” While Zawahiri is dangerous, Gerges says, he lacks Bin Laden’s charisma or strategic sense. The best way to undermine what remains of the organization would be to withdraw American troops from Muslim countries.

Far bleaker is Michael Scheuer's assessment in “The Zawahiri Era,” published in the new issue of The National Interest, a conservative policy journal best known as Francis Fukuyama's venue for proclaiming “The End of History” in 1989. Scheuer is a former CIA analyst and the author of Osama Bin Laden (Oxford University Press, 2011). While noting Zawahiri’s “potentially debilitating personality traits and leadership quirks,” Scheuer also calls him “a rational, prudent, brave, dedicated and media-savvy leader,” fully capable of rebuilding the movement.

But that assumes Zawahiri can attract new fighters. Whatever recruitment spike Al Qaeda enjoyed after 9/11 has long since exhausted itself, to judge by the excerpt from The Missing Martyrs by Charles Kurzman appearing in the September-October issue of Foreign Policy (which is something like Foreign Affairs' younger, better-dressed sibling). Kurzman is a professor of sociology at the University of North Carolina at Chapel Hill. As with Gerges and Scheuer, his book is from Oxford University Press. “By my calculation,” he writes, “global Islamist terrorists have managed to recruit fewer than 1 in 15,000 Muslims over the past quarter century and fewer than 1 in 100,000 Muslims since 9/11.” (The article is available to subscribers.)

Mohammad Atta and his associates were not riding the wave of the future, then: “There aren’t very many Islamist terrorists,” Kurzman says, “and most are incompetent. They fight each other as much as they fight anybody else, and they fight their potential state sponsors most of all. They are outlaws on the run in almost every country in the world, and their bases have been reduced to ever-wilder patches of remote territory, where they have to limit their training activities to avoid satellite surveillance.”

So much of the discussion leading up to this anniversary looks to the present or the future -- as if 9/11 were not in the past, but rather something that still abides. As Jürgen Habermas said in an interview a few years ago, 9/11 may have been the first event to be experienced, as it was happening, on a really global scale. That may have something to do with the way it seems to have irradiated everything, and to linger in the air.

In their article “The September 11 Digital Archive,” appearing in the fall issue of Radical History Review, Stephen Brier and Joshua Brown seem to echo the philosopher’s point. “One difference demarcating September 11, 2001, from previous epochal historical moments,” they write, “was its status as the first truly digital event of world historical importance: a significant part of the historical record -- from email to photography to audio to video -- was expressed, captured, disseminated, or viewed in (or converted to) digital forms and formats.”

To preserve these traces for the future was an undertaking both urgent and vast. Within two months of the attacks, the American Social History Project at the City University of New York Graduate Center and the Center for History and New Media at George Mason University began working to gather and store such material, as well as thousands of recollections of the day submitted by the public. (Brier was a co-founder of the ASHP and Brown is currently its executive director.)

While still under development -- adding adequate metadata to the files, for example -- the September 11 Digital Archive is available online and should be taken over by the Library of Congress in 2013. It should not be confused with the LoC’s September 11, 2001, Web Archive, which has screen shots of websites around the world that were taken, according to the library’s description, between September 11 and December 1 of 2001. Unfortunately, the collection is rather primitive and unreliable. A number of items are actually from late 2002 and have no bearing on 9/11; some entries in the register turn out to have no corresponding webpage.

No doubt a much better digital archive for 9/11 is on an NSA server somewhere. It may be some while before historians get to see it – maybe by the centennial? In the meantime, the rest of RHR's special issue "Historicizing 9/11" can be downloaded here.

 

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Recovery in Political Science

Discipline shows signs of an improved job market, with strong gains in assistant professor positions. Plus a new analysis of tenure decisions.

Why I Study Europe

As a black woman in political science, Terri E. Givens is constantly asked why her research expertise isn't what people expect.

Living on the Edge

Normally I would be averse to going public with the internal affairs of the Flat Earth Society. But this is not the time for silence or misguided diplomacy. The failure of our leadership to throw the Society's full support behind the Academic Bill of Rights is little short of scandalous.

It is time to put an end to the constant stream of indoctrination in America's college classrooms on the part of "scholars" only too willing to serve the interests of the globe-manufacturing lobby. Students should be given a chance to use their own rationality and powers of observation. Remember, the so-called "theory" of spherical-earthism is just that -- a theory. (I mean, come on! It's just a matter of common sense. The world can't be round. The people in Australia would fall off.)

At the same time, the Society has nearly liquidated its treasury in placing a bulk order for a new book by Thomas Friedman, the New York Times foreign affairs columnist, called The World is Flat. The cover is, to be sure, very impressive. It portrays two ships and a small boat sailing dangerously close to the edge of the earth. However, I am now reading the book, and am sorry to report it is not nearly as good as we all had hoped.

Friedman argues that the rapid spread of high-speed digital communication has created conditions in which skilled labor in now-impoverished countries can be integrated into a new economic order that will end extreme disparities in wealth and development. The world will be less uneven, and in that sense more "flat."

It's a book about globalization, in other words. Which makes the title (not to mention the artwork, which has given me nightmares) very sneaky indeed.

To be honest, I'm not entirely sure that the Flat Earth Society is still active. (It has a Web page, though that doesn't mean much.) But a recent reading of Martin Gardner's classic Fads and Fallacies in the Name of Science is a reminder that it was in 1905 that the Rev. Wilbur Glenn Voliva became General Overseer of Zion, Illinois -- a town in which church and state were, at the time, pretty much identical. Voliva ministered to the Christ Community Church and enforced strict blue laws, while also carrying on the scientific research necessary to prove that (as Gardner puts it) "the earth is shaped like a flapjack, with the North Pole at the center and the South Pole distributed around the circumference."

He offered a reward of $5,000 to anyone who could prove otherwise, and never had to part with any of his money. It is good to know that, 100 years later, Voliva's scholarly efforts may yet win a hearing in American academic life -- thanks to the tireless efforts of David Horowitz.

As for Thomas Friedman .... well, his version of flat-earth doctrine is bound to have an impact on academe, even if no professor ever opens his latest volume. The people flying in business class read Friedman's books -- and that includes plenty of university administrators, those acting CEOs of the knowledge economy.

Nor will it hurt that The World is Flat is, in effect, one long plea to corporations, government officials, and any other policy-makers who might be reading to invest in higher education as the nation's top priority for the future. In a world where more and more jobs can be done more cheaply, in new places, people need constantly to update, refine, or change entirely their toolkit of knowledge and skills.  

Friedman has a knack for harvesting the information, opinions, and gut instincts of some of the most powerful people in the world. He boils it all down into some catchy slogans, and voila! You've got a bouillon cube of the conventional wisdom for the next two or three years.
 
He is bullish on the long-term benefits of the global market -- with that congenital optimism tempered (occasionally, and just a little) by the experience of having served as a Middle East correspondent. And he shows a faith in the power of corporations to become good global citizens that is either inspiring or willfully obtuse -- depending on whether or not you are annoyed by the fact that The World is Flat contains exactly zero interviews with labor leaders.

It is his instinct towards globalization boosterism that gives the edge, so to speak, to Friedman's thesis on what he calls "flatism." In short, his argument is that the technological infrastructure now exists to make it economically rational for more and more kinds of business to be conducted in a way that is dispersed over networks that span the entire world. Outsourcing no longer means shifting manufacturing offshore -- or even having the less-skilled kinds of service-sector jobs (data keypunching, for example) done in another country.

Work requiring more sophisticated cognitive skills -- bookkeeping, computer programming, or the analysis of medical test results, for example -- can be done in India or China at much less expense. Jobs thus become more mobile than the people who do them.

Friedman's main point is that this is not a trend that will take shape at some point in the future. It is happening right now; the trend will not reverse. And the American political parties and the cable news programs are not telling the public what is happening. They are, as Friedman puts it, "actively working to make people stupid."

Instead, "companies should be encouraged, with government subsidies or tax incentives, to offer as wide an array as possible of in-house learning opportunities," thereby "widening the skill base of their own workforce and fulfilling a moral obligation to workers whose jobs are outsourced to see to it that they leave more employable than they came."

Friedman also favors "an immigration policy that gives a five-year work visa to any foreign student who completes a Ph.D. at an accredited American university in any subject. I don't care if it's Greek mythology or mathematics. If we cream off the first-round intellectual draft choices from around the world, it will always end up a net plus for America."

In a way, Friedman has come to his own version of some of the ideas that Manuel Castells developed some years ago in the three large volumes of The Information Age. There, the sociologist worked out an account of how the "space of flows" between parts of a dispersed economic network would transform the "space of places" (that is, the real-world geography) in which people dwell.

As with Friedman's notion of "flatism," the increased productivity and ceaseless disruption of network society were basic to the picture that Castells drew. But he also stressed something that Friedman -- with his abiding cheerfulness -- tends to downplay: Skills, knowledge, and wealth accumulate at the dispersed nodes of an economic network, but some parts of the world fall outside the network more or less entirely.

Most of Africa, for example. Last year, a study found that 96 percent of the continent's population had no access to telecommunications of any kind. Given the unavailability of drinking water and medical supplies, that is probably the least of anyone's worries. But even with the recent increase in wireless access in Africa -- thereby potentially getting around the scarcity and unreliability of more traditional landline telecommunication -- it is unlikely that part of the world will be "flattening" anytime soon. (Some might see the glass as 96 percent empty, but I suppose someone encouraged by Friedman's book would consider it 4 percent full.)

Meanwhile, it is difficult to feel much optimism about Friedman's proposal for beefing up the resources for increasing the educational opportunities of the American workforce. At least for now, the public discourse on higher education is caught in a particularly narrow and regressive set of undercurrents.

It's possible to joke about how the Rev. Voliva's scholarship in flat-earth studies might finally start getting its due. But matters are serious when scientists are forced to resort to references to Lysenkoism to describe the government's science policy. And higher education itself is the target of a barrage from ideologues who seem to have confused The Authoritarian Personality with a manual for self-improvement.

It would be good to think that the national agenda could change -- that the notion of "flatism," whatever its limitations, might help spur increased public commitment to continuing education. But then, as Friedman also says, certain politicians and media outlets are "actively working to make people stupid." With that part, at least, he's being realistic.  

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Write On

Later this year, I'll give a paper at the annual convention of the American Political Science Association. For someone who is not a political scientist, this is a bizarre prospect -- like one of those dreams in which you must take a final exam in a course you’ve never actually taken. My topic involves tracing one strand of neoconservative ideology back to its source in a far-flung mutation of Marxist theory. I’ve been doing the research for about 20 years, off and on, without ever quite supposing that it would culminate in a presentation in front of a bunch of professors.

Then again, the matter is sufficiently esoteric that "bunch" may not be the exact word. Chances are there will be more than enough chairs.

In any case, a mass of old books and photocopies is now stacked up, to an unstable height, on my desk. And on top of the pile there is a notebook. The reading notes, the rough outline, the first draft or two ... all will be written there, in longhand.

My friends and colleagues are occasionally nonplussed to learn that someone trying to make a living as a writer actually spends the better part of his workday with pen in hand. (It’s probably comparable to finding out that your doctor grows blood-sucking leeches in the basement.) Like an interest in the fine distinctions made by the ancient Trotskyists, my writing habits are idiosyncratic, anachronistic, and more or less impossible to justify in terms that make any sense given the state of 21st-century American culture.

Yet the rut is now too deep to crawl out of. I have my reasons. Or perhaps, to be more precise, my rationalizations. Not that they persuade anybody else, of course. It’s particularly awkward when an editor asks for a progress report. There is a certain uncomfortable silence when I say, "Well, the notebook is almost full...."

Nowadays, the word "text" connotes an artifact that is "always already" digitized -- something to be fed into a streamlined apparatus for circulating information. But the word itself comes from the Latin root texere, to weave, as in "textile."

In my own experience, though, writing is not so much the crafting of paragraphs as it is a matter of laboriously unknotting the thread of any given idea. And the only way to do that is by hand. The process is messy and not terribly efficient.

Writing this column twice a week, for example, is a matter of juggling two legal pads of different sizes, plus anywhere from one to three notebooks. It is easy to detect which parts were written with a cup of coffee in one hand: The sentences are long, the handwriting spiky, the parentheses nestled one inside the next. By its penultimate phase, the draft is a puzzling array of arrows, boxes, Venn diagrams, and Roman numerals. (Also, as the case may require, whatever lower-case letters of the Greek alphabet I can still remember.)

The effect resembles the flow chart for a primitive computer program to be run on a wheezy old tube-driven UNIVAC.

Only as the deadline approaches is anything actually typed up, in a kind of spastic marathon. By that point, a certain passage from Walter Benjamin always comes to mind: "The work is a death mask of its conception."
 
Actually, with hindsight, it’s easy to see that Benjamin got me started on this erratic and circuitous course. In a collection of essays and fragments called One Way Street, he offers a set of aphorisms on writing, including the one just quoted. (First published in 1928, it is now available in the first of a four-volume edition of his work in English published by Harvard University Press.)

"Let no thought pass incognito," Benjamin insisted, "and keep your notebook as strictly as the authorities keep their register of aliens." (A line that became more poignant after the Nazis came to power, forcing Benjamin to spend the rest of his life in exile.)

But one passage in particular made a huge impression on me. "Avoid haphazard writing materials," admonished Benjamin. "A pedantic adherence to certain papers, pens, inks is beneficial. No luxury, but an abundance of these utensils is indispensable."

As if to clinch it, there is an interview that Roland Barthes gave in 1973 that seems to ratify Benjamin’s point. Under the title "An Almost Obsessive Relation to Writing Instruments," it was reprinted posthumously in a collection called The Grain of the Voice: Interviews 1962-1980, published by the University of California Press.

In a gesture very typical of his structuralist penchant for creating categorical distinctions, Barthes notes that his own writing process goes through two stages: "First comes the moment when desire is invested in a graphic impulse," said Barthes. It was a phase of copying down "certain passages, moments, even words which have the power to move me," and of working out "the rhythm of a sentence" that gives shape to his own ideas. Only much later can the text be "prepared for the anonymous and collective consumption of others through transformation into a typographical object" -- a moment, according to Barthes, when the writing "is already beginning its commercialization."

Clearly the important phase is the one in which "desire is invested in a graphic impulse." And for that, you need the right tools. "I often switch from one pen to another just for the pleasure of it," Barthes told the interviewer. "As soon as I see a new one, I start craving it. I cannot keep myself from buying them."

The one exception was the Bic, which Barthes found disgusting: "I would even say, a bit nastily, that there is a 'Bic style,' which is really just for churning out copy...."

So the penchant for haunting stationery stores (and otherwise indulging a fetish for writing supplies) has the endorsement of distinguished authorities. But my efficiency-cramping distaste for the computer keyboard is somewhat more difficult to rationalize.

Walter Benjamin and Roland Barthes died long before word processors were available, of course. But a good excuse not to write first drafts that way comes from the poet Ted Hughes, in a passage quoted by Alice W. Flaherty in her fascinating book The Midnight Disease: The Drive to Write, Writer’s Block, and the Creative Brain.

In an account of judging in a contest for children’s writing, Hughes recalled that the entries once tended to be two or three pages long. "But in the early 1980s," he said, "we suddenly began to get seventy and eighty page works. These were usually space fiction, always very inventive and always extraordinarily fluent -- a definite impression of a command of words and prose, but without exception strangely boring...."

In each case, the kid had composed the miniature magnum opus on a word processor.

"What’s happening," according to Hughes, "is that as the actual tools for getting words onto the page became more flexible and externalized, the writer [could] get down almost every thought or extension of thought. That ought to be an advantage. But in fact, in all these cases, it just extends everything slightly too much. Every sentence is too long. Everything is taken a bit too far, too attenuated."

Which sounds, come to think of it, somewhat like what Barthes called "Bic style." And quite a bit like the output of various academic presses it would be discreet to leave unnamed.

Not that writers had to wait for the advent of the word processor to produce work that was (in Hughes’s terms) "extraordinarily fluent" yet "strangely boring."

Indeed, in the mid-1920s, Walter Benjamin gave practical tips to scholars who wanted to impress their readers by clobbering them into a stupor. In a satiric chapter of One Way Street called "Principles of the Weighty Tome, or How to Write Fat Books," he laid out the principles that many still follow today.

"The whole composition must be permeated with a protracted and wordy exposition of the initial plan," Benjaim wrote. "Conceptual distinctions laboriously arrived at in the text are to be obliterated again in the relevant notes....Everything that is known a priori about an object is to be consolidated by an abundance of examples.... Numerous opponents who all share the same argument should each be refuted individually."

Benjamin himself never got an academic position, of course. Even so, good advice is timeless.  

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Throat Culture

For the past few days, I've been waiting for a review copy of Bob Woodward's book The Secret Man: The Story of Watergate's Deep Throat to arrive from Simon and Schuster. So there has been some time to contemplate the way that (no longer quite so) mysterious figure has been "inscribed" in a "double register" of "the historical imaginary," as the cult-stud lingo has it. (Sure hope there's a chance to use "imbricated discourse" soon. Man, that would be sweet.)

Putting it in slightly more commonplace terms: Two versions of Deep Throat have taken shape in the past 30 years or so. They correspond to two different ways of experiencing the odd, complex relationship between media and historical memory.

On the one hand, there was Deep Throat as a participant in a real historical event -- making the question of his motivation an important factor in making sense of what happened. It was even, perhaps, the key to understanding the "deep politics" of Watergate, the hidden forces behind Richard Nixon's fall. The element of lasting secrecy made it all kind of blurry, but in a fascinating way, like some especially suggestive Rorschach blot.

On the other hand, there was Deep Throat as pure icon -- a reference you could recognize (sort of) even without possessing any clear sense of his role in Watergate. It started out with Hal Holbrook's performance in All the President's Men -- which, in turn, was echoed by "the cigarette-smoking man" on "The X-Files," as well as the mysterious source of insider information about the Springfield Republican Party on "The Simpsons." And so Deep Throat (whose pseudonym was itself originally a movie title) becomes a mediatic signifier unmoored from any historical signified. (An allusion to an allusion to a secret thus forgotten.)

Different as they might be, these two versions of Deep Throat aren't mutually exclusive. The discourses can indeed become imbricated (yes!), as in the memorable film Dick, which reveals Deep Throat as a pair of idealistic schoolgirls who guide the cluelessly bumbling Woodward and Bernstein through the mysteries of the Nixon White House.

There is something wonderful about this silly premise: In rewriting the history of Watergate, Dick follows the actual events, yet somehow neutralizes their dire logic by just the slightest shift of emphasis. The deepest secret of an agonizing national crisis turns out to be something absurd.

That perspective is either comically subversive or deeply cynical. Either way, it's been less anticlimactic, somehow, than the revelation of Deep Throat's real identity as the former FBI official Mark Felt. So much for the more elaborate theories about Watergate -- that it was, for example, a "silent coup" by a hard-right anticommunist faction of the U.S. military, upset by the administration's dealings with the Soviets and the Chinese. And Deep Throat's role as emblem of noir-ish intrigue may never recover from the impact of the recent, brightly lit video footage of Mark Felt -- half-dazed, half mugging for the camera.

And there have been other disappointments. This week, I had an interesting exchange by e-mail with Bill Gaines, a professor of journalism at the University of Illinois at Urbana-Champaign and two-time winner of the Pulitzer, not counting his two other times as finalist. His part in the Deep Throat saga came late in the story, and it's caused him a certain amount of grief.

But it was also -- this seems to me obvious -- quite honorable. If anything, it is even more worthy of note now that Bob Woodward is telling his side of the story. (While Carl Bernstein also has a chapter in the book, it was Woodward who had the connection with Felt.)

In 1999, Gaines and his students began an investigation designed to determine the identity of Deep Throat. The project lasted four years. It involved sifting through thousands of pages of primary documents and reading acres of Watergate memoir and analysis -- as well as comparing the original articles by Woodward and Bernstein from The Washington Post to the narrative they provided in their book All the President's Men. Gaines also tracked down earlier versions of the manuscript for that volume -- drafted before Woodward decided to reveal that he had a privileged source of inside information.

Gaines and his students compiled a database they used to determine which of the likely candidates would have actually been in a position to leak the information that Deep Throat provided. In April 2003, they held a press conference at the Watergate complex in Washington, DC, where they revealed ... the wrong guy.

After a period of thinking that Deep Throat must have been Patrick Buchanan (once a speechwriter for Nixon), the researchers concluded that it had actually been Fred Fielding, an attorney who had worked as assistant to John Dean. The original report from the project making the case for Fielding is still available online -- now updated with a text from Gaines saying, "We were wrong."

The aftermath of Felt's revelation, in late May, was predictably unpleasant for Gaines. There were hundreds of e-mail messages, and his phone rang off the hook. "Some snickered as if we had run the wrong way with the football," he told me.

But he added, "My students were extremely loyal and have told anyone who will listen that they were thrilled with being a part of this project even though it failed." Some of those who worked on the project came around to help Gaines with the deluge of correspondence, and otherwise lend moral support.

As mistaken deductions go, the argument offered by Gaines and his students two years ago is pretty rigorous. Its one major error seems to have come at an early stage, with the assumption that Woodward's account of Deep Throat was as exact as discretion would allow. That was in keeping with Woodward's own statements, over the years. "It's okay to leave things out to protect the identity of a source," he told the San Francisco Chronicle in 2002, "but to add something affirmative that isn't true is to publish something you know to be an inaccuracy. I don't believe that's ethical for a reporter."

The problem is that the original account of Deep Throat doesn't line up quite perfectly with what is known about Mark Felt. Some of the discrepancies are small, but puzzling even so. Deep Throat is a chain smoker, while Felt claimed to have given up the demon weed in 1943. "The idea that Felt only smokes in the garage [during his secretive rendezvous with Woodward] is a little hard to swallow," says Gaines. "I cannot picture him buying a pack and throwing the rest away for the drama it will provide." By contrast, Fielding was a smoker.

More substantive, perhaps, are questions about what Deep Throat knew and how he knew it. Gaines and his students noted that statements attributed to Deep Throat in All the President's Men were credited to a White House source in the original newspaper articles by Woodward and Bernstein. (Felt was second in command at the FBI, not someone working directly for the White House, as was Fielding.)

Deep Throat provided authoritative information gleaned from listening to Nixon's secret recordings during a meeting in November 1973. That was several months after Felt left the FBI. And to complicate things still more, no one from the FBI had been at the meeting where the recordings were played.

According to Gaines, that means Felt could only have learned about the contents of the recordings at third hand, at best. Felt was, as Gaines put it in an e-mail note, "so far removed that his comments to Woodward would have to be considered hearsay, and not the kind of thing a reporter could write for fact by quoting an anonymous source."

When I ask Gaines if there is anything he hopes to learn from Bob Woodward's new book, he mentions hoping for some insight into one of the more memorable descriptions of the secret source -- the one about how Deep Throat "knew too much literature too well." In any case, Gaines makes a strong argument that Woodward himself took a certain amount of literary license in transforming Felt into Deep Throat.

"We know from our copy of an earlier manuscript that Woodward changed some direct quotes attributed to Throat," he notes. "They were not major changes, but enough to tell us that he was loose with the quotes. There is information attributed to Throat that Felt would not have had, or that doesnot agree with what we found in FBI files."

As the saying has it, journalists write a first draft of history. One of the ethical questions involves trying to figure out just how much discretion they get in polishing the manuscript. Gaines seems careful not to say anything too forceful on this score -- though he does make clear that he isn't charging Woodward with creating a composite character.

That has long been one of the suspicions about Deep Throat. Even the new revelation hasn't quite dispelled it. Just after Felt went public with his announcement, Jon Wiener, a professor of history at the University of California at Irvine, reviewed some of the grounds for thinking that "several people who provided key information ... were turned into a composite figure for dramatic purposes" by Woodward and Bernstein. (You can find more of Wiener's comments here, at the very end of the article.)

For his part, Gaines says that the Deep Throat investigation isn't quite closed -- although he wishes it were. "I have always wanted to move on to something more important for the class project," he told me, "but the students and the media have caused us to keep going back to the Throat story."

Maybe now they should look into the mystery surrounding Deep Throat's most famous line: his memorable injunction to Woodward, "Follow the money."

It appears in the movie version of All the President's Men, though it can't be found in the book. When asked about it in an interview some years ago, Woodward guessed that it was an embellishment by William Goldman, the screenwriter. But Goldman has insisted that he got the line from Woodward.

Now it's part of the national mythology. But it may never have actually happened. Sometimes I wish the discourses would stop imbricating long enough to get this kind of thing sorted out.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Reading Left to Right

Once upon a time -- back in the days of dial-up and of press conferences devoted to the presidential libido -- there was a phenomenon known as the "web log." It was like a blog, only different. A web log consisted almost entirely of links to pages that the 'logger had recently visited online. There might also be a brief description of the site, or an evaluative remark. But the commentary was quick, not discursive; and it was secondary to the link. The product resembled an itinerary or a scrapbook more than it did a diary or an op-ed page.

So when Political Theory Daily Review started in January 2003, it already looked a little bit old-fashioned, blogospherically speaking. It was a log, plain and simple. There were three new links each day. The first was to a newspaper or magazine article about some current event. The second tended to go to a debate or polemical article. And the third (always the wild card, the one it was most interesting to see) would be academic: a link to a scholarly article in an online journal, or a conference site, or perhaps the uploaded draft of a paper in PDF.

In the intervening years, the site has grown wildly -- at least in size, if not in reputation. (Chances are that more bloggers read Political Theory than ever link to it.) The same three departments exist, but often with a dozen or more links in each. By now, clearly, the Review must be a team effort. The sheer volume of material logged each day suggests it is run by a collective of gnomes who tirelessly scour the Web for eruditia.

But in fact, it is all the work of one person, Alfredo Perez, who keeps a pretty low profile, even on his own site. I got in touch with Perez to find out who he is, and how he puts the Review together. (I also wondered if he ever got much sleep, but forgot to ask that part.) Here, in any case, is the gist of our e-mail discussion, presented with his permission.

Alfredo Perez is 34 years old and originally from Puerto Rico. After going to college in the United States, he went back to the island to work in the government for a few years, then headed to New York in 1996. He ended up at the New School, where he is now pursuing a dissertation on political theory. He lists his research interests as "normative political theory, cosmopolitanism and sovereignty, theories of human nature, and political economy."

Now, alembicating all of that down to a manageable dissertation is not so easy. And it sounds like Political Theory Daily Review has had a complicating effect on the whole process. "Writing a dissertation is an exercise in becoming an expert in one small piece of scholarly real estate," he says. "It really hasn't helped in that way."

But the Review has also had its educational benefits for Perez. It has encouraged him to keep up with fields that are now in the news: "the debate regarding constitutional interpretation, the arguments about American foreign policy and its impact around the world, and the space for religion in the public sphere...." He says he "probably would have been much less informed about [these areas] without having to keep up the site."

Over the year or so that I've come to rely on the Review as a gateway to new material online, the most striking thing has been Perez's mix of sources. On the one hand, he covers extremely topical material -- "ripped from today's headlines," with quite a few of those headlines being from the English-language editions of foreign newspapers and magazines.

On the other hand, some of the sites to which Perez links are exotic, esoteric, or just downright weird. I'm glad to hear about the debate over liberalism in a Slovakian journal called Kritika & Kontext -- but could probably have lived without seeing the United States Christian Flag. It is a relief, though, to learn that the latter Web site's sponsors "are not trying to overthrow the government or force anyone to be a Christian." Thank heaven for small favors.

How does Perez keep up with all this stuff? What are his criteria for linking? Do readers send him tips?

To take the last question first: No, for the most part, they don't. Evidently he just has one wicked set of bookmarks.

"I try to link to things that are interesting to me or to anyone trying to keep up with current events," says Perez, "not just political theory.... I don't link to technical papers on, say, economics, but if I see an interview with Gary Becker or an article on Amartya Sen, I don't think twice about linking to that. Sometimes I link to articles on Theory, essays by literary critics, or events in the world of literature." He also has an interest in the natural sciences -- biology, in particular -- so he links to things he's following in Scientific American and other publications.

Perez doesn't link to blogs. That way, madness lies. "It would be too much work to consider linking to the blogosphere," he says.

He places a special emphasis on pointing readers to "articles that are sure -- or have the potential -- to become part of what's debated in the public sphere." That includes things like op-eds in The New York Times, articles on public policy in The American Prospect, and essays from the socialist journal Dissent -- "material that I think should be a part of the 'required reading' for anyone who wants to stay on top of the news and public debates."

His default list of required readings shows a certain tilt to the left. But he also links to material far removed from his own politics -- publications such as Reason, First Things, Policy Review, and The Occidental Quarterly. Actually, it was Perez's site that first introduced me to the latter periodical, which describes itself as a "journal of Western thought and opinion." Its editors are keen on eugenics, stricter immigration laws, and the European cultural tradition (in particular the German contribution thereto).

"I think it obvious," says Perez, "that anyone interested in public debates about more philosophical matters has to be familiar with those on 'the other side.' I think it's just plain smart to do so. Reading counterarguments to your position can often be more helpful than readings that just confirm your own point of view." He says he makes no claim to be "fair and balanced," but also "doesn't want to alienate visitors who are on the right. I want them coming back!"

Any editorializing at Political Theory Daily Review tends to be implicit, rather than full-throated. It may be that lack of a sharp ideological edge, as much as the sheer number of links in the course of a week, that creates the impression that the site is the work of a committee.

Perez admits that he's "not very comfortable about publishing opinions willy-nilly like many people are when writing on their blogs. In fact, I am part of a group blog, Political Arguments, but I hardly ever post there." It's not that he lacks a viewpoint, or is shy about arguing politics and philosophy with his friends and family.

"I'm pretty sure I could defend those views well enough," he told me. "I guess it's my way of being a bit careful about the whole process. People in academia cannot be timid about their own views, of course, especially political theorists with regards to politics. But it's different when discussing day-to-day events as soon as they happen."

The line between public intellectual and pompous gasbag is, to be sure, a slender one; and it runs down a slippery slope. Perez's caution is understandable. "I don't think I have to mention any specific names in academia as examples," he says, "in order to make my point here."

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Democracy at Risk

This may be a minority opinion, but I’ll stick by it: Nothing clarified the course of American politics quite like the 1996 election -- and in particular the presidential debate. First there was the stirring rhetoric of Kang: "The politics of failure have failed! We have to make them work again." Then came the statesmanlike reply of Kodos: "I am looking forward to an orderly election tomorrow that will eliminate the need for a violent bloodbath."

For any old-fashioned, television-averse academic readers out there, it might be good to explain that Kang and Kodos are two drooling, mono-eyeballed, multi-tentacled creatures from another galaxy who occasionally show up on "The Simpsons." In 1996, they kidnapped Clinton and Dole and took their places on the campaign trail. Nobody really noticed.

Kang’s catchy slogan about the politics of failure came to mind while reading Democracy at Risk: How Political Choices Undermine Citizen Participation, and What We Can Do About It, a new book published by the Brookings Institution Press. The title page lists 19 authors, all of them distinguished political scientists. (For a list, look here.) They are scrupulously non-partisan -- almost transcendentally so. "Favoring a party has never been our aim," they write, "nor have we self-consciously striven for partisan balance; instead, we recommend what we think is best for the nation." They write with such sober, intelligent concern about the state of the republic that one cannot help feeling a little guilty for comparing them to extraterrestrials hell-bent on earth’s domination. I’m sure that most of them have no such intent.

But it sometimes happens that sober, intelligent books that straddle the divide between scholarship and public-spirited worrying can be a source of frustration. Democracy at Risk is a case in point. Not because it is hysterical or outlandish. Far from it. Rather, it's so level-headed as to be somewhat anodyne. Which is strange, because parts of the report are fairly troubling to think about. Decorum is a fine thing, but not always the most suitable response. Sometimes yelling is appropriate.

The team putting Democracy at Risk together started out, three years ago, as a task force of the American Political Science Association. It was charged with "bring[ing] the insights of political science to bear on the problem of civic engagement."

The latter phrase refers to the American public's steady, long-term trend towards increasing apathy, ignorance, and passivity in regard to all things political.

Increased voter turnout in 2004 was, seemingly, an exception to the trend. It was widely regarded as the most important presidential race in recent memory. Between 59 and 60 percent of those registered to vote actually did. Democracy at Risk puts that in context by noting that the level of participation "was about the same as in 1956, when an incumbent president handily and predictably defeated the same challenger he had faced four years earlier." And in spite of a massive get-out-the-vote effort "in which interest groups alone spent more than $350 million," the turnout "was only 5 percentage points higher than in 2000."

Calling this a case of "the problem of civic engagement" is more neutral-sounding than references to apathy, ignorance, and passivity. It has the advantage of avoiding the censorious overtone of those words. The authors cite Plato and Aristotle as the founding fathers of their discipline. But perhaps it would have been more fitting to take their inspiration from Confucius (the forerunner of an astute, brass-tacks kind of poli sci). For as Confucius put it, you can begin to rectify the order of things only by giving them their proper names.

Chart after chart in Democracy at Risk shows the downward curve of almost any index of participation you’d care to name. "Between 1974 and 1994," the authors point out, "engagement in twelve key political activities, such as writing letters to the editor, participating in rallies and demonstrations, and volunteering in campaigns, fell significantly.... From the mid-1970s to the present, the number of adolescents who say they can see themselves working on a political campaign has dropped by about half." Some measures indicated that "college graduates nowadays know as much about politics as the average high school senior did fifty years ago."

The situation is one of mutually reinforcing disincentives to anything resembling meaningful citizenship -- with a corrosive effect on the legitimacy of established institutions. "When rates of turnout are low," say the political scientists, "those who do turn up tend to be disproportionately ideological zealots." That heightens the tendency towards chest-thumping within the political class: "As recently as the late 1970s and early 1980s, there were many members of the House and Senate who were 'cross-pressured' and prepared to work across party lines to solve national problems. There are now far fewer such legislators."

Which, in turn, makes for gridlock and grandstanding. Which then inspires revulsion (and reduced voter turnout) across large sectors of the "I’m-a-pragmatic-moderate" electorate. And so, come election time, mostly the energized base shows up at the polls. Repeat ad nauseam, ad infinitum.

That is a quintessentially centrist diagnosis, of course. "Our politics has become far more rancorous," the report states, "and this not only makes it harder to legislate but also turns off moderate voters, while arousing the passions of those at ideological extremes (precisely the opposite of what our political institutions should do as a general matter)."

Well, I have my doubts about that "should." Justice is not usually on the side of genial indifference -- which, after all, is what the paralyzed response to "ideological extremes" amounts to, often enough. The Abolitionists and Suffragettes were on the fringe. Aleksandr Solzhenitsyn was, and is, in sundry regards, more than a bit of a wingnut. Better the truths of their ideological extremism than the status quo ante consensus ranged against them.

Even so, there is something worrying about the self-disenfranchisement of the professedly moderate citizens: Namely, the danger of a crisis of legitimacy, eroding whatever minimal level of trust in the institutions of governance is required to avoid (as Kodos puts it) a violent bloodbath.

Not that the authors put it quite like that. But come on -- let’s not pretend it isn't something to consider. Remember the angry crowds in Florida, five years ago. Think of the situation in France right now, with its wireless rioters. It is an article of faith among some people on the left that non-voting by the non-elite is a sign of their superior insight into the nature of the system. (I speak as someone who used to finger that particular set of rosary beads, from time to time.) But chances are an American flash-mob uprising would be somewhat less inspiring than the storming of the Bastille.

So what do the authors of Democracy at Risk propose? All sorts of things -- most of them modest reforms, aimed at reducing various barriers to civic participation that have developed over the years.

Some are simple enough that they might just happen. For example, they suggest mailing polling place information to registered voters shortly before Election Day. Others feel only slightly more realistic than proposing to repeal the law of gravity: "Vigilant enforcement of fair housing laws already on the books would help to ensure that all Americans have the same opportunity to live in desirable neighborhoods. Increasing racial and social integration would, in turn, inject diversity and pluralism into experiences of local engagement."

There should be "a greater emphasis in schools on civic education that emphasizes civic duty and empowerment" so that young citizens-in-training will "learn not only how to think about political forces, causes, and effects but also about what they can do and the conditions under which they can make a difference."

It is difficult to argue with this suggestion. For that matter, it is difficult even to say, in any concrete sense, what it actually means. But unless either Jon Stewart or Playstation is involved, it’s probably a waste of everyone's time.

The authors’ willingness to brainstorm is commendable, though the list of proposals can leave you with more worry and bafflement. It's not just that the proposals are sometimes a bit vague, or even slightly comic. (Will it really do much good to "encourage recognition of a young person’s 'first vote' as a significant rite of passage"?)

The limits on civic participation are, in part, a matter of the de facto disenfranchisement of people with limited economic and educational opportunities. Addressing that means taking political action. Which is, in turn, a matter of increased civic participation on the part of people who aren’t being served by the present arrangement. See the problem? No doubt some of the authors did, too, without quite solving it. Then again, they’re political scientists, not political magicians.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Piled Higher and Deeper

Rick Perlstein, a friend from the days of Lingua Franca, is now working on a book about Richard Nixon. Last year, he published a series of in-depth articles about the Republican Party and the American conservative movement. (Those are not quite the same thing, though that distinction only becomes salient from time to time.) In short, Perlstein has had occasion to think about honesty and dissimulation -- and about the broad, swampy territory in between, where politicians finesse the difference. As do artists and used-car salesmen....

It’s the job of historians to map that territory. But philosophers wander there, too. “What is truth?” as Nietzsche once asked. “A mobile army of metaphors, metonymies, anthropomorphisms. Truths are illusions of which one has forgotten that they are illusions.” Kind of a Cheneyo-Rumsfeldian ring to that thought. It comes from an essay called “On Truth and Lie in an Extra-Moral Sense,” which does, too, come to think of it.  

So anyway, about a week ago, Rick pointed out a recent discussion of how the Bush Administration is dealing with critics who accuse it of fudging the intelligence that suggested Saddam Hussein had weapons of mass destruction. The link went to a comment by Joshua Micah Marshall, who is a liberal Democrat of the more temperate sort, not prone to hyperventilation. 

“Garden variety lying is knowing it’s Y and saying it’s X,” he wrote, giving Lyndon Johnson on the Gulf of Tonkin as an example. The present executive branch, he continued, shows “a much deeper indifference to factual information in itself.”

Rick posed an interesting question: “Isn't Josh Marshall here describing as the Administration's methodology exactly what that Princeton philosophy prof defines as ‘bullshit’?” That prof being, of course, Harry Frankfurt, whose short and best-selling treatise On Bullshit will probably cover everyone’s Christmas bonus at Princeton University Press this year. 

In February, The New York Times beat us by a day or so with its article on the book, which daintily avoided giving its title. But “Intellectual Affairs” first took a close look, not just at Frankfurt’s text -- noting that it remained essentially unchanged since its original publication as a scholarly paper in the 1980s -- but at the philosophical critique of it presented in G. A. Cohen’s essay “Deeper into Bullshit.”

Since then, the call for papers for another volume of meditations on the theme of bull has appeared. Truly, we are living in a golden age.

The gist of Frankfurt’s argument, as you may recall, is that pitching BS is a very different form of activity from merely telling a lie. And Marshall’s comments do somewhat echo the philosopher’s point. Frankfurt would agree that “garden variety lying” is saying one thing when you know another to be true. The liar operates within a domain that acknowledges the difference between accuracy and untruth. The bullshitter, in Frankfurt’s analysis, does not. In a sense, then, the other feature of Marshall’s statement would seem to fit. Bullshit involves something like “indifference to factual information in itself.”

So does it follow, then, that in characterizing the Bush team’s state of mind three years ago, during the run-up to the war, we must choose between the options of incompetence, dishonesty, and bullshit? Please understand that I frame it in such terms, not from any political motive, but purely in the interest of conceptual rigor.

That said.... It seems to me that this range of terms is inadequate. One may agree that Bush et al. are profoundly indifferent to verifiable truth without concluding that the Frankfurt category necessarily applies.

Per G. A. Cohen’s analysis in “Deeper into Bullshit,” we must stress that Frankfurt’s model rests on a particular understanding of the consciousness of the liar. The mind of the bullshitter is defined by contrast to this state. For the liar, (1) the contrast between truth and untruth is clearly discerned, and (2) that difference would be grasped by the person to whom the liar speaks. But the liar’s intentionality also includes (3) some specific and lucidly grasped advantage over the listener made possible by the act of lying.

By contrast, the bullshitter is vague on (1) and radically unconcerned with (2). There is more work to be done on the elements of relationship and efficacy indicated by (3). We lack a carefully argued account of bullshit’s effect on the bullshitee.

There is, however, another possible state of consciousness not adequately described by Frankfurt’s paper. What might be called “the true believer” is someone possessing an intense concern with truth.

But it is a Higher Truth, which the listener may not (indeed, probably cannot) grasp. The true believer is speaking a truth that somehow exceeds the understanding of the person hearing it.

During the Moscow Trials of the late 1930s, Stalin’s prosecutor lodged numerous charges against the accused that were, by normal standards, absurd. In many cases, the “evidence” could be shown to be false. But so much the worse for the facts, at least from the vantage point of the true believer. If you’ve ever known someone who got involved in EST or a multi-level marketing business, the same general principle applies. In each case, it is not quite accurate to say that the true believers are lying. Nor are they bullshitting, in the strictest sense, for they maintain a certain fidelity to the Higher Truth.

Similarly, it did not matter three years ago whether or not any evidence existed to link Saddam and Osama. To anyone possessing the Higher Truth, it was obvious that Iraq must be a training ground for Al Qaeda. And guess what? It is now. So why argue about it?

On a less world-historical scale, I see something interesting and apropos in Academe, the magazine of the American Association of University Professors. In the latest issue, David Horowitz makes clear that he is not a liar just because he told a national television audience something that he knew was not true. 

(This item was brought to my attention by a friend who teaches in a state undergoing one of Horowitz’s ideological rectification campaigns. My guess is that he’d rather not be thanked by name.)

Here’s the story so far: In February, while the Ward Churchill debate was heating up, Horowitz appeared on Bill O’Reilly’s program. It came up that Horowitz, like Churchill, had been invited to lecture at Hamilton College at some point. But he was not, he said, “a speaker paid by and invited by the faculty.” 

As we all know, university faculties are hotbeds of left-wing extremism. (Especially the business schools and engineering departments. And reports of how hotel-management students are forced to read speeches by Pol Pot are positively blood-curdling.) Anyway, whenever Horowitz appears on campus, it’s because some plucky youngsters invite him. He was at Hamilton because he had been asked by “the conservative kids.”

That came as a surprise to Maurice Isserman, a left-of-center historian who teaches at Hamilton College. When I saw him at a conference a few years ago, he seemed to have a little gray in his hair, and his last book, The Other American: The Life of Michael Harrington, was a biography of the founder of the Democratic Socialists of America. No doubt he’s been called all sorts of things over the years, but “conservative kid” is not one of them. And when Horowitz spoke at Hamilton a few years ago, it was as a guest lecturer in Isserman’s class on the 1960s. 

As Isserman put it in the September/October issue of Academe: “Contrary to the impression he gave on ‘The O’Reilly Factor,’ Horowitz was, in fact, an official guest of Hamilton College in fall 2002, invited by a faculty member, introduced at his talk by the dean of the faculty, and generously compensated for his time.”

I will leave to you the pleasure and edification of watching Horowitz explain himself in the latest issue of Academe. But in short, he could not tell the truth because that would have been a lie, so he had to say something untrue in order to speak a Higher Truth. 

My apologies for the pretzel-like twistiness of that paraphrase. It is all so much clearer in the original Newspeak: Thoughtcrime is doubleplusungood.

