Kaplan CEO's book takes on higher ed's incentive system

Andrew S. Rosen, Kaplan's CEO, takes on the traditional view of college with his debut book, arguing that higher education needs a "reboot" to meet America's goals.

Daytona State reins in its push toward e-textbooks

Daytona State reins in a plan to push students and faculty toward electronic textbooks.

New book on college students and hip-hop culture

Author argues that hip-hop's impact on campus life is significant -- and that educators should know more about it.

Review of Edward H. Miller, "Nut Country: Right-Wing Dallas and the Birth of the Southern Strategy"

Trying to explain recent developments in the American presidential primaries to an international audience, a feature in this week’s issue of The Economist underscores aspects of the political landscape common to both the United States and Europe. “Median wages have stagnated even as incomes at the top have soared,” the magazine reminds its readers (as if they didn’t know and had nothing to do with it). “Cultural fears compound economic ones” under the combined demographic pressures of immigration and an aging citizenry.

And then there’s the loss of global supremacy. After decades of American ascent, “Europe has grown used to relative decline,” says The Economist. But the experience is unexpected and unwelcome to those Americans who assumed that the early post-Cold War system (with their country as the final, effectively irresistible superpower) represented the world’s new normal, if not the End of History. The challenges coming from Putin, ISIS and Chinese-style authoritarian state capitalism suggest otherwise.

Those tensions have come to a head in the primaries with the campaigns of Donald Trump and Bernie Sanders: newcomers to their respective parties whose rise seemed far-fetched not so long ago. To The Economist’s eyes, their traction is, if not predictable, then at least explicable: “Populist insurgencies are written into the source code of a polity that began as a revolt against a distant, high-handed elite.”

True enough, as far as it goes. The analysis overlooks an important factor in how “anti-elite” sentiment has been channeled over the past quarter century: through “anti-elitist” tycoons. Trump is only the latest instance. Celebrity, bully-boy charisma and deep pockets have established him as a force in politics, despite an almost policy-free message that seems to take belligerence as an ideological stance. Before that, there was the more discreet largesse of the Koch brothers (among others) in funding the Tea Party, and earlier still, Ross Perot’s 1992 presidential campaign, with its folksy infomercials and simple pie charts, which drew almost a fifth of the popular vote. In short, “revolt against a distant, high-handed elite” may be written into the source code of American politics; the billionaires have the incentives and the means to keep trying to hack it.

If anything, even Perot was a latecomer. In the opening pages of Nut Country: Right-Wing Dallas and the Birth of the Southern Strategy (University of Chicago Press), Edward H. Miller takes note of a name that’s largely faded from the public memory: H. L. Hunt, the Texas oilman. Hunt was probably the single richest individual in the world when he died in 1974. He published a mountain of what he deemed “patriotic” literature and also funded a widely syndicated radio program called Life Line. All of it furthered Hunt’s tireless crusade against liberalism, humanism, racial integration, socialized medicine, hippies, the New Deal, the United Nations and sundry other aspects of the International Communist Conspiracy, broadly defined. ("Nut country" is how John F. Kennedy described Dallas to the first lady a few hours before he was killed.)

Hunt’s output was still in circulation when I grew up in Texas a few years after his death, and reading it has meant no end of déjà vu in the meantime: the terrible ideas of today are usually just the terrible ideas of yesterday, polished with a few updated topical references. Miller, an assistant teaching professor of history at Northeastern University Global, reconstructs the context and the mood that made Dallas a hub of far-right political activism between the decline of Joseph McCarthy and the rise of Barry Goldwater -- a city with 700 members of the John Birch Society. A major newspaper, The Dallas Morning News, helped spur the sales of a book called God, the Original Segregationist by running excerpts. Cold War anti-Communism mutated into a belief that the United States and the Soviet Union were in the process of being merged under the direction of the United Nations, in the course of which all reference to God would be outlawed. John F. Kennedy was riding roughshod over American liberties, bypassing Congress and establishing a totalitarian dictatorship in which, as H. L. Hunt warned, there would be “no West Point, no Annapolis, no private firearms -- no defense!”

An almost Obama-like dictatorship, then. Needless to say, these beliefs and attitudes are still with us, even if many of the people who espoused them are not.

Miller identifies two tendencies or camps within right-wing political circles in Dallas during the late 1950s and early ’60s. “Moderate conservatism” was closer to established Republican Party principles of free enterprise, unrelenting anti-Communism and the continuing need to halt and begin rolling back the changes brought by the New Deal. Meanwhile, “ultraconservatism” combined a sense of apocalyptic urgency with fear of all-pervasive subversion and conspiracy. A reader familiar with recent laments about the state of the Republican Party -- that it was once a much broader tent, with room for even the occasional liberal -- might well assume that Miller’s moderate conservatives consisted of people who liked Ike, hated Castro and otherwise leaned a bit to the right wing of moderation, as opposed to ultraconservative extremism.

That assumption is understandable but largely wrong: Miller’s moderates were much closer to his ultras than either was to, say, the Eisenhower who sent federal troops to Little Rock, Ark. (Or as someone the author quotes puts it, the range of conservative opinion in Dallas was divided between those who wanted to impeach Supreme Court Justice Earl Warren and those who wanted to hang him.)

Where the moderates and the ultras ultimately combined to create something more durable and powerful than either could be separately was in their opposition to the Civil Rights movement and in the realignment of the segregationist wing of the Democratic Party. I’ll come back to Miller’s argument on this in a later column, once the primary season has progressed a bit. Suffice it to say that questions of realignment are looking less antiquarian all the time.


James Stein discusses new short story collection 'L.A. Math'

Math professor and author discusses new short story collection that blends his discipline with fiction.

Essay on David Bowie

There’s a special rung of hell where the serious and the damned writhe in agony, gnashing their teeth and cursing their fate, as they watch an endless marathon of historical documentaries from basic cable networks. Their lidless eyes behold Ancient Aliens, now in its tenth season, and High Hitler, which reveals that the Führer was a dope fiend. The lineup includes at least one program about the career of each and every single condemned soul in the serial-killer department, which is a few rungs down.

In the part of the inferno where I presumably have reservations, a lot of the programming concerns the history of rock music. With each cliché, a devil pokes you, just to rub it in. The monotonous predictability of each band’s narrative arc (struggle, stardom, rehab, Hall of Fame) is just part of it, since there are also the talking-head commentaries, interspersed every few minutes, by people unable to assess any aspect of the music except through hyperbole. Each singer was the voice of the era. Every notable guitarist revolutionized the way the instrument was played -- forever. No stylistic innovation failed to change the way we think about music, influencing all that followed.

Even the devils must weary of it, after a while. It probably just makes them meaner.

Here on earth, of course, such programming can be avoided. Choose to watch Nazi UFOs -- an actual program my TiVo box insists on recording every time it runs -- and you really have no one to blame but yourself.

But David Bowie’s death earlier this month left me vulnerable to the recent rerun of a program covering most of his life and work. Viewing it felt almost obligatory: I quit keeping track of Bowie’s work in the early 1980s (a pretty common tendency among early devotees, the near-consensus opinion being that he entered a long downturn in creativity around that time), so catching up on his last three decades, however sketchily, seemed like a matter of paying respects. It sounds as if his last few albums would be worth a listen, so no regrets about watching.

Beyond that, however, the program offered only the usual insight-free superlatives -- echoes of the hype that Bowie spent much of his career both inciting and dismantling. Bowie had a precocious and intensely self-conscious relationship to mass media and spectacle. He was, in a way, Andy Warhol’s most attentive student. That could easily have led Bowie down a dead end of cynicism and stranded him there, but instead it fed a body of creative activity -- in film and theater as well as music -- that fleshed out any number of Oscar Wilde’s more challenging paradoxes. (A few that especially apply to Bowie’s career: “To be premature is to be perfect.” “One should either be a work of art or wear a work of art.” “Man is least himself when he talks in his own person. Give him a mask, and he will tell you the truth.”) There must be a whole cohort of us who lived through early theoretical discussions of postmodernism and performativity while biting our tongues, thinking that an awful lot of it was just David Bowie, minus the genius.

“Genius” can be a hype word, of course, but the biggest problem with superlatives in Bowie’s case isn’t that they are clichéd but that they’re too blunt. Claim that Bowie invented rock stardom, as somebody on TV did, for example, and the statement is historically obtuse while also somehow underestimating just how catalytic an impact he had.

As noted here in a column some months ago, Bowie is not among the artists David Shumway wrote about in Rock Star: The Making of Musical Icons from Elvis to Springsteen (Johns Hopkins University Press, 2014). And yet one aspect of Bowie’s career often taken as quintessential, his tendency to change appearances and styles, actually proves to be one of the basic characteristics of the rock star’s cultural role, well before his Thin White Duke persona rose from the ashes of Ziggy Stardust. Context undercuts the hype.

Elsewhere, in an essay for the edited collection Goth: Undead Subculture (Duke, 2007), Shumway acknowledges that Bowie did practice a kind of theatricalization that created a distinctive relationship between star and fan: the “explicit use of character, costume and makeup … moved the center of gravity from the person to the performance” in a way that seemed to abandon the rock mystique of authenticity and self-expression in favor of “disguising the self” while also reimagining it.

“His performances taught us about the constructedness of the rock star and the crafting of the rock performance,” Shumway writes. “His use of the mask revealed what Dylan’s insistence on his own authenticity and Elvis’s swagger hid.”

At the same time, Bowie’s decentered/concealed self became something audiences could and did take as a model. But rather than this being some radical innovation that transformed the way we think about rock forever (etc.), Shumway suggests that Bowie and his audience were revisiting one of the primary scenes of instruction for 20th-century culture as a whole: the cinema.

Bowie “did not appear to claim authenticity for his characters,” Shumway writes. “But screen actors do not claim authenticity for the fictional roles they play either. Because he inhabits characters, Bowie is more like a movie star than are most popular music celebrities. In both cases the issue of the star’s authenticity is not erased by the role playing, but made more complex and perhaps more intense.”

That aptly describes Bowie’s effect. He made life “more complex and perhaps more intense” -- with the sound of shattering clichés mixed into the audio track at unexpected moments. And a personal note of thanks to Shumway for breaking some, too.

Image: Memorial to David Bowie (Getty Images)

Book review of Hugh Pennington's "Have Bacteria Won?" (essay)

Last month came the unwelcome if not downright chilling news that the antibiotic of last resort -- the most powerful infection fighter in the medical arsenal -- is now ineffective against some new bacterial strains. If, like me, you heard that much and decided your nerves were not up to learning a lot more, then this might be a good time to click over to see what else looks interesting in the Views section menu. There’s something to be said for deliberate obliviousness on matters that you can’t control anyway.

Hugh Pennington’s Have Bacteria Won? (Polity) is aimed straight at the heart of a public anxiety that has grown over the past couple of decades. The author, an emeritus professor of bacteriology at the University of Aberdeen, is clearly a busy public figure in the United Kingdom, where he writes and comments frequently on medical news for the media. A number of recent articles in British newspapers call him a “leading food-poisoning expert,” but that is just one of Pennington’s areas of expertise. Besides contributing to the professional literature, he has served on commissions investigating disease outbreaks and writes “medico-legal expert witness reports” (he says in the new book) on a regular basis.

The fear resonating in Pennington’s title dates back to the mid-1990s. Coverage of the Ebola outbreak in Zaire in 1995 seemed to compete for attention with reports of necrotizing fasciitis (better known as “that flesh-eating disease”), which inspired such thought-provoking headlines as “Killer Bug Ate My Face.”

Pennington refers to earlier cases of food contamination that generated much press coverage -- and fair enough. But it was the ghastly pair of hypervirulent infections in the news 20 years ago that really raised the stakes of something else that medical researchers were warning us about: the widespread overuse of antibiotics. It was killing off all but the most resilient disease germs. An inadvertent process of man-made natural selection was underway, and the long-term consequences were potentially catastrophic.

But now for the good news, or the nonapocalyptic news, anyway: Pennington makes a calm assessment of the balance of forces between humanity and bacteria and, without being too Pollyannaish about it, suggests that unpanicked sobriety would be a good attitude for the public to cultivate, as well.

The history of medical advances in the industrialized world has, he argues, had unexpected and easily overlooked side effects. Now we live, on average, longer than our ancestors. But we also die for different reasons, with mortality from infection no longer being high on the list. The website of the Centers for Disease Control and Prevention makes the point sharply with a couple of charts: apart from a spike during the influenza pandemic following the First World War, death from infectious disease fell in the United States throughout most of the 20th century. Pennington’s point is that we find this trend throughout the modernized world, wherever life expectancy increased. Medical advances, including the development of antibiotics, played a role, but not in isolation. Improved sanitation and increased agricultural output were also part of it.

“There is a pattern common to rich countries,” Pennington notes. “The clinical effects of an infection become much less severe long before specific control measures or successful treatments become available. Their introduction then speeds up the decline, but from a low base. An adequate diet brings this about.”

So death from infectious disease went from being a terrible fate to something practically anomalous within two or three generations. (To repeat, we’re talking about the developed world here: both prosperity and progress impose blinders.) And when serious infectious disease becomes rare, it also becomes news. “From time to time,” Pennington says, “the media behave like a cheap refracting telescope, focusing on an object of interest but magnifying it with a good deal of aberration and fuzziness at the edges because of the poor quality of their lenses.”

Lest anyone think that the competitive shamelessness of the British tabloid press has excessively distorted Pennington’s outlook, keep in mind that CNN once had a banner headline reading, “Ebola: ‘The ISIS of Biological Agents?’” Nor does he demonize the mass media, as such. “Sometimes the journalistic telescope finds hidden things that should be public,” he writes -- giving as an example how a local newspaper identified and publicized an outbreak of infectious colitis at an understaffed and poorly run hospital in Scotland.

Have Bacteria Won? is packed with case histories of outbreaks from the past 60 or 70 years. Each is awful enough in its own right to keep the reader from feeling much comfort at their relative infrequency, and Pennington’s message certainly isn’t that disease can be eradicated. Powerful and usually quite effective techniques exist to prevent or minimize bacterial contamination of food and water, and we now have systematic ways to recognize and treat a wider range of infections than would have been imaginable not that long ago. But systems fail (he mentions several cases of defective pasteurization equipment causing large-scale outbreaks) and bacteria mutate without warning. “Each microbe has its own rules,” Pennington writes. “Evolution has seen to that.”

We enjoy some advantage, given our great big brains, especially now that we have the tools of DNA sequencing and ever-increasing computational power. "This means," Pennington writes, "that tracking microbes, understanding their evolution and finding their weaknesses gets easier, faster and cheaper every day." Given reports that the MCR-1 gene found in antibiotic-impervious bacteria can move easily between micro-organisms, any encouraging word is welcome right about now.

But Pennington's analysis also implies that the world's incredible and even obscene disparities in wealth are another vulnerability. "An adequate diet" for those who don't have it seems like something all that computational power might also be directed toward. Consider it a form of preventative medicine.


Essay on Wikipedia's fifteenth anniversary

Wikipedia came into the world 15 years ago today -- and, man, what an ugly baby. The first snapshot of it in the Internet Archive is from late March of 2001, when Wikipedia was already 10 weeks old. At that point, it claimed to have more than 3,000 pages, with an expressed hope of reaching 10,000 by the end of summer and 100,000 at some point in the not unimaginably distant future. The first entries were aspirational, at best. The one about Plato, for example, reads in its entirety: “Famous ancient Greek philosopher. Wrote that thing about the cave.”

By November -- with Wikipedia at 10 months old -- the entry on Plato was longer, if not more enlightening: you would have learned more from a good children’s encyclopedia. Over the next several months, the entry grew to a length of about 1,000 words, sometimes in classically padded freshman prose. (“Today, Plato's reputation is as easily on a par with Aristotle's. Many college students have read Plato but not Aristotle, in large part because the former's greater accessibility.”) But encouraging signs soon began to appear. A link directed the reader to supplementary pages on Platonic realism, for example. As of early 2006, when Wikipedia turned five years old, the main entry on Plato had doubled in length, with links to online editions of his writings. In addition, separate pages existed for each of the works -- often consisting of just a few sentences, but sometimes with a rough outline of the topics to be covered in a more ambitious entry somewhere down the line.

The aspirations started to look more serious. There were still times when Wikipedia seemed designed to give a copy editor nightmares -- as in 2003, when someone annotated the list of dialogues to indicate: “(1) if scholars don't generally agree Plato is the author, and (2) if scholars don't generally disagree that Plato is not the author of the work.”

Yet it is also indicative of where the site was heading that before long some volunteer stepped in to unclog that passage's syntactical plumbing. The site had plenty of room for improvement -- no denying it. On the other hand, improvements were actually happening, however unsystematically.

The site hit its initial target of 100,000 pages in early 2003 -- at which point it began to blow up like a financial bubble. There were not quite one million pages by the fifth anniversary of its founding and 3.5 million by the tenth. Growth has slowed of late, with an average of about 300,000 pages being added annually over the past five years.

I draw these figures from Dariusz Jemielniak’s Common Knowledge? An Ethnography of Wikipedia (Stanford University Press, 2014), which also points out how rapidly the pace of editorial changes to articles began to spike. Ten million edits were made during Wikipedia’s first four years. The next 10 million took four months. From 2007 on, the frequency of edits stabilized at a rate of 10 million edits per seven or eight weeks.
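For readers who want that acceleration spelled out, here is a minimal back-of-the-envelope sketch -- my own rough conversion of the milestones Jemielniak reports, not a calculation from his book -- expressing each 10-million-edit interval as an average number of edits per day (the day counts are approximations I've assumed):

```python
# Rough conversion of the edit-count milestones quoted above into average
# edits per day. The 10-million-edit intervals come from Jemielniak's figures
# as cited; the interval lengths in days are approximate assumptions.

def edits_per_day(edits: int, days: float) -> float:
    return edits / days

milestones = {
    "first four years (2001-05)": edits_per_day(10_000_000, 4 * 365),            # ~6,800/day
    "next 10 million edits (~4 months)": edits_per_day(10_000_000, 122),          # ~82,000/day
    "from 2007 on (10 million per ~7.5 weeks)": edits_per_day(10_000_000, 7.5 * 7),  # ~190,000/day
}

for label, rate in milestones.items():
    print(f"{label}: roughly {rate:,.0f} edits per day")
```

On those assumptions, the editing pace jumped by more than an order of magnitude between the early years and the mid-2000s before leveling off.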

We could continue in this quantifying vein for a while. As with the Plato entry finding its center of gravity after a long period of wobbly steps, the metrics for Wikipedia tell a story of growth and progress. So does the format’s worldwide viability: Wikipedia is now active in 280 languages, of which 69 have at least 100,000 entries. It all still seems improbable and inexplicable to someone who recalls how little credibility the very concept once had. (“You can edit this page right now! … Write a little (or a lot) about what you know!”) If someone told you in 2002 that, in 10 years, the Encyclopædia Britannica would suspend publication of its print edition -- while one of the world’s oldest university presses would be publishing material plagiarized from Wikipedia, rather than by it -- the claim would have sounded like boosterism gone mad.

That, or the end of civilization. (Possibly both.) What’s in fact happened -- celebrate or mourn it as you will -- has been a steady normalization of Wikipedia as it has metamorphosed from gangly cultural interloper into the de facto reference work of first resort.

In large measure, the transformation came about as part of what Siva Vaidhyanathan has dubbed “the Googlization of everything.” Wikipedia entries normally appear at or near the top of the first page of the search engine’s results. After a while, the priority that the Google algorithm gives to Wikipedia has come to seem natural and practically irresistible. At this point, having a look at Wikipedia is usually quicker and easier than deciding not to (as someone once said about reading the comic strip “Nancy”).

Another sign of normalization has been the development of bibliographical norms for citing Wikipedia in scholarship. It signals that the online reference work has become a factor in knowledge production -- not necessarily as a warehouse of authoritative information but as a primary source, as raw material, subject to whatever questions and methods a discipline may bring to bear on it.

In the case of that Plato entry, the archive of changes over time would probably be of minimal interest as anything but a record of the efforts of successively better informed and more careful people. But Wikipedia’s role as a transmitter of information and an arena for contesting truth claims make its records a valuable source for people studying more recent matters. Someone researching the impact of the Sandy Hook Elementary School shootings, for example, would find in the Wikipedia archive a condensed documentation of how information and arguments about the event appeared in real time, both in its immediate aftermath and for years afterward.

I've been reading and writing about Wikipedia for this column for most of its lifespan, and it won't be five years before there's occasion to do so again. There's plenty more to say. But for now, it seems like Professor Wikipedia should get the last word.


Recommended books that give career advice for Ph.D.s (essay)

Natalie Lundsteen shares a shortlist of standout books providing valuable guidance to Ph.D.s engaged in a career search.


Author discusses 'physics envy' book about Cold War poetry

Author discusses new book about poetry and science during the Cold War.
