Interview with the author of new book on Mexican-American students


Author discusses new book on the challenges facing Mexican-American students.

Mellen Drops Suit Against Librarian

Edwin Mellen Press is dropping a wildly unpopular libel lawsuit against a university librarian, CBC News reported. Mellen sued Dale Askey, associate university librarian at McMaster University in Ontario, where he has worked since 2011, over a highly critical blog post he wrote in August 2010, when he was at Kansas State University. The press says that it wants to focus on its authors and books and so is dropping the suit. Many scholarly and library groups were furious about the lawsuit and criticized Mellen for filing it.



Colleges try to beat textbook costs with book reserves


To lessen the impact of rising textbook costs, three institutions have created programs that allow students to borrow course materials.

Review of Jeremy Bernstein, "A Palette of Particles"

Intellectual Affairs

At the end of "The Incredible Shrinking Man" (1957), our unfortunate hero -- having survived encounters with a house cat and a spider on the way down -- finds himself smaller than an atom, with no end in sight. We see him awaken on what looks like a planet, made up (one reckons) of even tinier atoms. Which in turn contain worlds, containing atoms, and so on.

A mystical epiphany seems totally appropriate under the circumstances. “Smaller than smallest,” he says in the closing voice-over, “I meant something too. To God there is no zero. I still exist!” A galaxy fills the screen: vaster, and more infinitesimal, than the viewer can possibly imagine. A psychedelic moment, with the 1960s not even started yet.

It works, in part, because the audience has seen the standard textbook drawing of an atom, with electrons orbiting the nucleus like planets around the sun. (A lumpy sun, to be sure, made of protons and neutrons.) All of the particles are little spheres. The obvious parallel to a solar system feels sublimely appropriate. To use the maxim alchemists once learned from Hermes Trismegistus: “As above, so below.”

But the parallelism, while convenient, is misleading. Electrons resemble clouds more than they do the billiard-ball planets in a science-fair exhibit. Protons and neutrons are waves as much as they are particles. And there’s scarcely any point to attempting a visual rendering of the still more elementary components of matter that Jeremy Bernstein writes about in A Palette of Particles (Harvard University Press). Apart from vintage photographs in which scientists discerned the trails left by a positron or an Omega-minus particle on the move, most of the entities Bernstein writes about are best “depicted” as mathematical formulae.

A professor emeritus of physics at the Stevens Institute of Technology, Bernstein is a prolific author of books on science for the lay reader -- and he brings to this popular history of particle physics the advantage of having been around when some of that history was being made. Bernstein, now in his 80s, knew Wolfgang Pauli, who hypothesized the existence of the neutrino in 1930, a quarter-century before it could be confirmed. (He also came up with two devastating remarks sometimes appropriated by people who haven’t heard of Pauli. One was to say of a theory that it “wasn’t even wrong.” The other was to refer to a colleague as “so young and already so unknown.”)

Particle physics has become staggeringly expensive (the search for the Higgs boson or “God particle” cost more than $13 billion), but Bernstein entered the field when budgets, like computation speeds, were a lot lower. He mentions being “the house theorist for the Harvard Cyclotron from 1955 to 1957,” when the machine and the building to house it “cost something like half a million dollars.”

And the old venues had their charm: he expresses a certain fondness for the Cosmotron, a particle accelerator that went into operation at Brookhaven National Laboratory in the early 1950s. “I was on the theoretical staff at Brookhaven for a couple of years in the 1960s,” he writes. “When the machine was down I used to go into the building at night to practice my trumpet. The acoustics were wonderful.”

His personal observations help ground what can prove a mind-bending tour of the infinitesimal. Physicists have discovered a whole menagerie of subatomic entities since James Chadwick identified the neutron in 1932. Some of them have hard-to-grasp qualities such as zero mass, or power that increases with distance. The distinctions among them involve terms such as “spin” or “color” that bear little or no relation to what they mean in ordinary usage.

Furthermore, particles are related to one another in various ways, and there are symmetries (and anomalous asymmetries) between them as well. Keeping it all straight is like remembering who’s who in a Russian novel.

Not a complaint, let me hasten to say: Bernstein covers the material in a sprightly manner, with only the occasional equation that will reveal the beauty of it all to the reader who can grasp it. And he takes a quick look at hypothetical particles that sound like something out of a sci-fi flick. One is the graviton: a gravitational quantum with no mass that moves at the speed of light. Another is the tachyon, which cannot move slower than the speed of light.

If tachyons do exist and could be used to transmit information (so goes the speculation), it might be possible to reverse cause and effect – to go backward in time. Bernstein does not sound optimistic about anyone proving the existence of the tachyon. Even so, the search is on. (Imagine the day that breakthrough is made. What could possibly go wrong?)

A Palette of Particles ends by comparing the domain of subatomic particles to “a series of nested Russian dolls: inside each one there is another.” Add to that the estimate by physicists that 85 percent of the matter in the universe consists of subatomic particles we don’t recognize or understand yet…. It turns out that Bernstein’s sober and lucid introduction to particle physics has an almost mystical quality, even if the author shows no interest in that kind of cosmic thinking. We’re back to what the incredible shrinking man tells us:

“So close, the Infinitesimal and the Infinite. But suddenly I knew they were really the two ends of the same concept. The unbelievably small and the unbelievably vast eventually meet like the closing of a gigantic circle.”


Review of Michael D. Gordin, 'The Pseudoscience Wars'

Once it would have been possible to jump right into a discussion of Michael Gordin’s The Pseudoscience Wars: Immanuel Velikovsky and the Birth of the Modern Fringe (University of Chicago Press) with the reasonable assumption that readers would have at least a nodding acquaintance with the maverick psychoanalyst’s ideas.

But today -- as Gordin, a professor of history at Princeton University, notes -- few people under the age of 50 will recognize Velikovsky’s name, much less know of his theory of the traumatic impact of cosmic catastrophes on human history. It was a heated topic for discussion in the 1970s. I recall seeing a poster for a meeting of Chaos and Chronos, a student organization dedicated to Velikovskian matters that once had clubs on many U.S. college campuses. This was as late as 1980 or ’81. (Which only corroborates Gordin’s point, for I am approaching the half-century mark at an alarming speed.)

So, first, a lesson in now-dormant controversy.

Although he published several other books during his lifetime, plus a few more posthumously, Velikovsky presented his core argument in a volume called Worlds in Collision (1950). It was an attempt to formulate the key to all mythologies, or at least an explanation of some of the more striking stories and beliefs of antiquity. Drawing on sources both classical and obscure, he showed that cultures all over the world preserved narratives in which the world passed through incredible catastrophes: the earth shook, the heavens darkened, the sun stood still, floods wiped out society, fire or stones or both fell from the sky, and so on. The cultures that preserved the tales explained the events as a manifestation of God’s wrath at humanity, or as the consequence of gods’ behavior toward one another.

An orthodox Freudian, Velikovsky had no use for Jung’s nebulous ideas about archetypes in the collective unconscious. His theory was more concrete, if no less strange. The far-flung legends all made sense as distorted accounts of a series of astronomical anomalies beginning circa 1500 B.C.E., when (he argued) a huge mass of matter broke off the planet Jupiter and spun off into space. It passed dangerously close to Earth a couple of times before eventually settling into orbit as the planet we now know as Venus.

Its comet-like transit through the solar system generated a series of events, both in outer space and here below, that continued for the better part of a thousand years. Earth and proto-Venus came near enough to affect each other’s orbits, and that of Mars as well. Bewildered by the strange things happening in the sky, mankind endured terrestrial upheaval on an incredible scale -- tectonic disasters, weird weather, and shifts of the planet’s axis, for example.

Once, when proto-Venus came close to Earth, its atmosphere permeated our own long enough to precipitate a fluffy, snow-like substance made of hydrocarbons. And so it came to pass that the Israelites received the manna falling from heaven that the Lord did send to nourish them.

Well, it’s a theory, anyway. Harper’s magazine ran an article about Velikovsky’s book in advance of its publication. Other, less sober publications followed suit, presenting Worlds in Collision as demonstrating the literal (albeit distorted) truth of events recorded in the Bible. The response by scientists was less enthusiastic, to put it mildly. The word “crackpot” tended to come up. Velikovsky’s interdisciplinary erudition impressed them only as evidence that he was profoundly ignorant in a number of fields. The American people would be dumber for reading the book, and so on.

Upon seeing the early publicity for Velikovsky’s book, some scientists were so disgusted that they wrote to Velikovsky’s publisher, Macmillan, to complain. Worlds in Collision had been listed in the firm’s catalog as a scientific work. The letter-writers considered this disgraceful, and warned of the potential damage to the press’s reputation in the scientific community. After a few university science departments canceled their meetings with Macmillan’s textbook salesmen, the publisher became alarmed and sold its rights to the book to Doubleday.

Velikovsky was unhappy about this, but the deal was hard on Macmillan as well. At its peak, Worlds in Collision was selling a thousand copies a week, despite being a rather pricey hardback. The backstage furor soon died down, as did public interest in Velikovsky’s claims. By 1951, his ideas must have seemed as if they would have no more of a future than the other big fad of the previous years, L. Ron Hubbard’s Dianetics. (The scholarly literature seems to have overlooked this bit of synchronicity, though I’m sure there is a master’s thesis in it for somebody.)

The Worlds in Collision affair might have been forgotten entirely if not for a special issue of the journal American Behavioral Scientist devoted to the whole matter, published in 1963. The contributors were interested not so much in Velikovsky’s ideas as in how scientists had responded to them – with peremptory dismissals based on the Harper’s article, emotional rhetoric, and behind-the-scenes pressure on his publisher. It amounted to censorship and the repression of ideas – the assertion of scientific authority against a theory, despite the lack of serious engagement with the book itself.

Velikovsky and Albert Einstein both lived in Princeton, N.J., during the 1950s, and Velikovsky could quote remarks from the physicist’s letters and conversation suggesting that Worlds in Collision was at least interesting and worthy of a hearing. This was by no means the only thing Einstein had to say. Gordin quotes a number of occasions when Einstein described Velikovsky as “crazy” -- and clearly he regarded the man as a pest, at times. But it's not difficult to imagine why the most famous Jewish immigrant in postwar America might develop sympathetic feelings for someone of a comparable background who seemed to be facing unfair persecution. Besides, they could speak German together. That counted for a lot.

In any case, their friendship also made it easier to argue that Velikovsky just might be too far ahead of his time. In the mid-1960s, students at Princeton University formed a discussion group on Worlds in Collision, and Velikovsky himself spoke there – the first of what became many lectures to packed halls. Given the spirit of the time, having been rejected and anathematized by the scientific establishment was, in its own way, a credential. Among young people, Velikovsky enjoyed the special authority that comes when mention of one’s ideas is sufficient to annoy, very noticeably, one’s professors.

In 1972, the editors of Portland State University’s student magazine Pensée turned it into a forum defending and developing Velikovsky’s ideas. Papers were peer-reviewed, sort of: they were submitted to scholars and scientists for vetting, though most of the reviewers were sympathetic to Velikovsky (and, it sounds like, also contributors). Pensée’s first all-catastrophism issue clearly met a need. It had to be reprinted twice and sold 75,000 copies, after which the journal’s circulation settled down to a still-remarkable 10,000 to 20,000 copies per year.

And if any more evidence of his status as countercultural eminence were needed, the American Association for the Advancement of Science held a Velikovsky symposium at its annual conference in February 1974. The most famous participant was the astronomer Carl Sagan, who challenged the author’s supposed scientific evidence for the cosmic-catastrophe scenario at considerable length. Velikovsky and his supporters were angry that all of the invited speakers were critical of his work. But the organizers invited Velikovsky himself to respond, which he did, also at considerable length. The symposium may not have vindicated Velikovsky, but it gave him an unusually prominent place at the table.

He died in 1979, and five years later Henry Bauer (now an emeritus professor of chemistry and science studies at Virginia Tech) published Beyond Velikovsky: The History of a Public Controversy (University of Illinois Press). It was the first book-length analysis of the whole saga and, for a long time, the last. Most of the secondary literature on Velikovsky appearing since his death resembles the material about him published during his lifetime, in that it is polemical, for or against. The one published biography of Velikovsky that I know of, drawing on his own memoirs, is by his daughter.

So Gordin’s The Pseudoscience Wars belongs to the fairly small number of studies that do not simply pour the old controversies into new bottles. In that regard, the title is something of a fake-out. The author doesn’t treat Velikovsky’s catastrophism as a variety of pseudoscience. He is dubious about the concept, both because it is applied to too many phenomena that don’t share anything (what do astrology, cold fusion, biorhythms, and the study of how ancient astronauts shaped human evolution really have in common?) and because no one has established an epistemological “bright line” to distinguish science proper from its pretenders. The word’s real significance lies in its use in shoring up the authority of those who use it. Calling something pseudoscience is more profoundly delegitimizing than calling it bad science.

Happily, the author spends only a little time on Sociology of Deviance 101-type labeling theory before getting down to the altogether more compelling labor of using archival material that was unavailable to Bauer 30 years ago -- especially the theorist’s personal papers, now in Princeton’s collection. Velikovsky was something of a packrat. If he ever parted with a document, it cannot have been willingly. The Pseudoscience Wars fills in the familiar outline of his career and controversy, as sketched above, with an abundance of new detail as well as insight into what Gordin calls “the development of Velikovskian auto-mythology.”

We learn, for one thing, that tales of a deliberate campaign of letter-writing and well-organized pressure on Macmillan through the threat of a boycott have little evidence to back them up. Accounts treating Velikovsky as an American Galileo typically suggest that his opponents wanted to prevent his ideas from receiving any hearing at all – that they were, in principle if not in method, book-burners.

But the existing documents suggest that the scientific community was chiefly troubled at seeing Worlds in Collision issued under the full authority of a major science and textbook publisher. A number of scientists responded by exerting pressure on Macmillan, but Gordin says the letters “were disorganized, uncoordinated, and threatened different things – some not to buy books, some not to referee manuscripts, others not to write them.”

Textbooks represented up to 70 percent of the publisher’s revenue, so professorial displeasure “had to be taken seriously. Macmillan could not afford to call it a bluff.” Once Worlds in Collision was sold to Doubleday (a trade publisher) scientists were content to mock the author’s grasp of geology, chemistry, celestial mechanics, etc. – or simply to ignore the book altogether.

Velikovsky converted the episode into a kind of moral capital, and Gordin demonstrates how shrewdly he and his admirers used it to build a scientific counter-establishment -- what one might otherwise call a full-scale pseudoscience. The analysis requires a number of detours, heading into territory where intellectual historians seldom venture – as in the sad tale of Donald Wesley Patten, author and publisher of The Biblical Flood and the Ice Epoch (1966), who ultimately proved too Velikovskian for the fundamentalists, and vice versa.

For a long time it seemed as if no one could go beyond Beyond Velikovsky. Gordin's book does not replace the earlier study, which remains an interesting and valuable book, and certainly worth the attention of anyone trying to decide whether to explore the terrain in more detail. But The Pseudoscience Wars puts the catastrophist’s ideas and aura into a wider and thicker context of ideas, people, and institutions -- a remarkable array, spanning from the Soviet genetics debates of the 1940s to today's fractious niche of (please accept my sincere apology for this next word) post-Velikovskyism.

Speaking of which, let me end with a prediction. While reading Gordin, it crossed my mind that the scenario of upheaval in Worlds in Collision might well speak to the sense of how precarious our little ball in space really is. Americans are not a thrifty people, but we do tend to recycle our cultural phenomena, and if there is one 20th-century idea that seems a likely candidate for 21st-century revival, it is probably catastrophism.


New book on the "professional shift" in the discipline of history


Author discusses new book on history's evolution as a discipline and what that means for the field today.

Review of Jason Dittmer, "Captain America and the Nationalist Superhero: Metaphors, Narratives, and Geopolitics"

Intellectual Affairs

In an essay first published in 1948, the American folklorist and cultural critic Gershon Legman wrote about the comic book -- then a fairly recent development -- as both a symptom and a carrier of psychosexual pathology. An ardent Freudian, Legman interpreted the tales and images filling the comics’ pages as fantasies fueled by the social repression of normal erotic and aggressive drives. Not that the comics were unusual in that regard: Legman’s wider argument was that most American popular culture was just as riddled with misogyny, sadomasochism, and malevolent narcissism. And to trace the theory back to its founder, Freud had implied in his paper “Creative Writers and Daydreaming” that any work of narrative fiction grows out of a core of fantasy that, if expressed more directly, would prove embarrassing or offensive. While the comic books of Legman’s day might be as bad as Titus Andronicus – Shakespeare’s play involving incest, rape, murder, mutilation, and cannibalism – they certainly couldn’t be much worse.

But what troubled Legman apart from the content (manifest and latent, as the psychoanalysts say) of the comics was the fact that the public consumed them so early in life, in such tremendous quantity. “With rare exceptions,” he wrote, “every child who was six years old in 1938 has by now absorbed an absolute minimum of eighteen thousand pictorial beatings, shootings, stranglings, blood-puddles, and torturings-to-death from comic (ha-ha) books alone, identifying himself – unless he is a complete masochist – with the heroic beater, strangler, blood-letter, and/or torturer in every case.”

Today, of course, a kid probably sees all that before the age of six. (In the words of Bart Simpson, instructing his younger sister: “If you don't watch the violence, you'll never get desensitized to it.”) And it is probably for the best that Legman, who died in 1999, is not around to see the endless parade of superhero films from Hollywood over the past few years. For in the likes of Superman, he diagnosed what he called the “virus” of a fascist worldview.

The cosmos of the superheroes was one of “continuous guilty terror,” Legman wrote, “projecting outward in every direction his readers’ paranoid hostility.” After a decade of supplying Superman with sinister characters to defeat and destroy, “comic books have succeeded in giving every American child a complete course in paranoid megalomania such as no German child ever had, a total conviction of the morality of force such as no Nazi could even aspire to.”

A bit of a ranter, then, was Legman. The fury wears on the reader’s nerves. But he was relentless in piling up examples of how Americans entertained themselves with depictions of antisocial behavior and fantasies of the empowered self. The rationale for this (when anyone bothered to offer one) was that the vicarious mayhem was a release valve, a catharsis draining away frustration. Legman saw it as a brutalized mentality feeding on itself -- preparing real horrors through imaginary participation.

Nothing so strident will be found in Jason Dittmer’s Captain America and the Nationalist Superhero: Metaphors, Narratives, and Geopolitics (Temple University Press), which is monographic rather than polemical. It is much more narrowly focused than Legman’s cultural criticism, while at the same time employing a larger theoretical toolkit than his collection of vintage psychoanalytic concepts. Dittmer, a reader in human geography at University College London, draws on Homi Bhabha’s thinking on nationalism as well as various critical perspectives (feminist and postcolonial, mainly) from the field of international relations.

For all that, the book shares Legman’s cultural complaints to a certain degree, although none of his work is cited. But first, it’s important to stress the contrasts, which are, in part, differences of scale. Legman analyzed the superhero as one genre among others appealing to the comic-book audience -- and that audience, in turn, as one sector of the mass-culture public. 

Dittmer instead isolates – or possibly invents, as he suggests in passing – a subgenre of comic books devoted to what he calls “the nationalist superhero.” This character-type first appears not in 1938, with the first issue of Superman, but in the early months of 1941, when Captain America hits the stands. Similar figures emerged in other countries, such as Captain Britain and (somewhat more imaginatively) Nelvana of the Northern Lights, the Canadian superheroine. What set them apart from the wider superhero population was their especially strong connection with their country. Nelvana, for instance, is the half-human daughter of the Inuit demigod who rules the aurora borealis. (Any relationship with actual First Nations mythology here is tenuous at best, but never mind.)

Since Captain America was the prototype – and since many of you undoubtedly know as much about him as I did before reading the book, i.e., nothing – a word about his origins seems in order. Before becoming a superhero, he was a scrawny artist named Steve Rogers who followed the news from Germany and was horrified by the Nazi menace. He tried to join the army well before the U.S. entered World War II but was rejected as physically unfit. Instead, he volunteered to serve as a human guinea pig for a serum that transformed him into an invincible warrior. And so, as Captain America – outfitted with shield and spandex in the colors of Old Glory – he went off to fight Red Skull, who was not only a supervillain but a close personal friend of Adolf Hitler.

Now, no one questions Superman’s dedication to “truth, justice, and the American way,” but the fact remains that he was an alien who just happened to land in the United States. His national identity is, in effect, luck of the draw. (I learn from Wikipedia that one alternate-universe narrative of Superman has him growing up on a Ukrainian collective farm as a Soviet patriot, with inevitable consequences for the Cold War balance of power.) By contrast, Dittmer’s nationalist superhero “identifies himself or herself as a representative and defender of a specific nation-state, often through his or her name, uniform, and mission.”

But Dittmer’s point is not that the nationalist superhero is a symbol for the country or a projection of some imagined or desired sense of national character. That much is obvious enough. Rather, narratives involving the nationalist superhero are one part of a larger, ongoing process of working out the relationship between the two entities yoked together in the term “nation-state.”

That hyphen is not an equals sign. Citing feminist international-relations theorists, Dittmer suggests that one prevalent mode of thinking counterposes “the ‘soft,’ feminine nation that is to be protected by the ‘hard,’ masculine state” -- which is also defined, per Max Weber, as claiming a monopoly on the legitimate use of violence. From that perspective, the nationalist superhero occupies the anomalous position of someone who performs a state-like role (protective and sometimes violent) while also trying to express or embody some version of how the nation prefers to understand its own core values.

And because the superhero genre in general tends to be both durable and repetitive (the supervillain is necessarily a master of variations on a theme), the nationalist superhero can change, within limits, over time. During his stint in World War II, Captain America killed plenty of people in combat with plenty of gusto and no qualms. It seems that he was frozen in a block of ice for a good part of the 1950s, but was thawed out somehow during the Johnson administration without lending his services to the Vietnam War effort. (He went to Indochina just a couple of times, to help out friends.) At one point, a writer was on the verge of turning the Captain into an overt pacifist, though the publisher soon put an end to that.

Even my very incomplete rendering of Dittmer’s ideas here will suggest that his analysis is a lot more flexible than Legman’s denunciation of the superhero genre. The book also makes more use of cross-cultural comparisons. Without reading it, I might never have known that there was a Canadian superhero called Captain Canuck, much less the improbable fact that the name is not satirical.

But in the end, Legman and Dittmer share a sense of the genre as using barely conscious feelings and attitudes in more or less propagandistic ways. They echo the concerns of one of the 20th century's definitive issues: the role of the irrational in politics. And that doesn't seem likely to become any less of a problem any time soon.




Inaugural Poet Boosts Sales at U. of Pittsburgh Press

The University of Pittsburgh Press is printing new copies of two collections of poetry by Richard Blanco, the inaugural poet selected by President Obama, and the press is preparing to release a new volume, which will include the inaugural poem, The Pittsburgh Post-Gazette reported. Orders are coming in fast. The books currently available from Pitt are City of a Hundred Fires and Looking for the Gulf Motel.




Review of Andrew Piper, "Book Was There"

Intellectual Affairs

A couple of months ago I interrupted several years of procrastination and finally got around to a time-consuming bit of housework: unpacking each volume from every shelf in my library, flipping it (the shelf, that is), and then putting the books into a more orderly state than they had been in for a long time. It was the work of several days. The shelves are thick and sturdy, but they had borne two rows of books, plus whatever could be fitted in horizontally, for more than a decade. With a dozen tiers to process -- at eight shelves per tier, and 25 to 50 volumes per shelf -- I had an incentive to build up all the mindless, robotic momentum possible. Stopping to read anything was strictly forbidden, for all the good that did.

They were, and still are, organized alphabetically by author’s name. Friends occasionally express dismay at this. It seems the most impersonal system possible short of arranging them by color. But putting them back on the shelf -- after sifting and sorting them, and a lot of dusting -- proved anything but impersonal. It was comparable to reading an old journal – that is, an experience of numbing repetitiveness, interrupted by melancholy and embarrassment. Several volumes have inscriptions from friends who have died. My copy of the collected Edgar Allan Poe was a Christmas present from pre-adolescence, when bookplates evidently struck me as the height of sophistication. What is stranger -- the extensive academic literature concerning UFO-based religions, or the fact that I seem to own all of it?

Memory kept sabotaging discipline. It’s amazing the job ever got done. The same cannot be said of winnowing through a couple of filing cabinets loaded with photocopies, a week or so later, which involved no more complex sentiment than a kind of satisfying ruthlessness. And with digital text, you don’t even get that. Every so often I copy all the e-books and article PDFs from the laptop to a flashdrive, which then gets dropped into a coffee cup on my desk, along with the others. Sorting and purging the e-library hardly seems worth the effort. Any item in it can be located and extracted within a few minutes. I have no fond memory of acquiring any of them. Downloading a book from Amazon must be consumerism at its most disenchanted. For that matter, thinking back on the e-books I’ve read, what comes to mind is almost always information, rather than the experience of reading them.

Andrew Piper’s Book Was There: Reading in Electronic Times (University of Chicago Press) occupies a niche somewhere between a couple of fields of study that were already interdisciplinary. One is the history of the book, from scroll to e-reader. The other is a phenomenological psychology of reading – an effort, that is, to describe the concrete experience of engaging with the written word, which involves more than the sense of sight, or even the neural processes that somehow convert squiggles into meaning.

“Books have been important to us,” Piper writes in a passage that made me glad to have read him, “because of the way our interactions with them span several domains of sensory and physical experience. Whether it is through the acts of touch, sight, sound, sharing, or acquiring a sense of place, [our] embodied, and at times impersonal, ways of interacting with books coalesce to magnify the learning that takes place through them. The same information processed in different ways and woven together is one of the profound secrets of bookish thought."

Piper, an associate professor of languages, literature, and culture at McGill University, in Montreal, won the Modern Language Association Prize for a First Book for Dreaming in Books: The Making of the Bibliographic Imagination in the Romantic Age (2009), also published by the University of Chicago Press; and his paper “Rethinking the Print Object: Goethe and the Book of Everything” (2006) received the Goethe Society of North America’s annual essay prize. While no less grounded in European cultural history than his earlier work, Book Was There (its title taken from Gertrude Stein) is more digressive and memoiristic. Parenthood supplements scholarship: one of his children learned to read as Piper was writing the book; his reflections on reading as an aspect of self-fashioning are at least partly grounded in family life.

His intent is not -- as the subtitle “Reading in Electronic Times” might suggest -- to deliver a screed against the e-text flood. Book Was There shows a wide knowledge of contemporary digital art and literature, and Piper makes a brief mention of his role in a collaborative project to create a computer model of the impact of The Sorrows of Young Werther on subsequent literature. Like anyone who has given the matter more than a soundbite’s worth of thought, he recognizes that the relationship between the cultural system now emerging and the previous thousand years of human civilization involves both continuities and disruptions, for better and for worse.

What Piper does insist on is the specificity of how we interact with text when incarnated as the artifact of the three-dimensional book. This begins with the hand, which navigates through a bound volume in a way distinct from the turning of a scroll or the button-punching we do on Kindle or Nook. (For one thing, the ancient and the digital format resemble each other at least as much as either does the codex or a book from the Gutenberg era.) One of Piper’s core ideas, radiating out in several directions throughout the book, is that the sense of touch creates “a form of redundancy” in the overall experience of reading, “enfolding more sensory information into what we see and therefore what we read.”

Someone who has lived closely with a given volume for a long time will have a sense of what Piper means. Even without bookmarks or notes, you often know how to look up something in it fairly quickly. The citation is also a location which you can (literally) feel your way to finding.

But there is more to it than that. “As early as the twelfth century,” Piper notes, “writers began drawing hands in the margins of their books to point to important passages. Such a device gradually passed into typescript and became a commonplace of printed books.” And in that regard it mimicked the book’s own role in pointing to specific aspects of the world – making them, in a hand-related analogy, “graspable.” (The implicit contrast here would be the link, which serves as another way to direct the reader’s eyes, but with the constant risk of diffusing attention instead of directing it.)

Handwriting is another mode of engagement with text -- one considerably less efficient than today’s cut-and-paste norm, as Piper acknowledges, but of value precisely for its slowness, which permits incorporation of meaning rather than the aggregation of content. He cites research showing a significant relationship between writing by hand and drawing:

“Early elementary school students who draw before they write tend to produce more words and more complex sentences than those who do not. And as historians of writing have shown, writing makes drawing more analytical. It allows for more complex visual structures and relations to emerge. As Goethe remarked, word and image, drawing and writing, are correlates that eternally search for one another. Handwriting is an integral means of their convergence.”

Cyberculture is all about convergence, if not in Goethean terms. It overloads the reader’s sensorium through every conceivable channel of communication (preferably at the same time) while bombarding us with invitations to respond to its messages, right away. “Interactivity is a constraint,” Piper writes, “not a freedom.” As if glossing his point, a satirical article in The Onion recently reported that Internet users had had enough. “Nobody needs to get my immediate take on everything I see online,” it quotes one woman as saying. “…. At best I’m just going to parrot back some loose approximation of what I’ve heard before, which will just prove that I never should have weighed in in the first place.”

Piper suggests that reading of the older sort provokes a kind of anxiety in the culture now because of its seeming isolation and inertia – so out of step with the drive for connectivity, quantifiable impact, and a rapid turnover in goods and services. But -- to continue his point -- the physical inertia tends to generate a much more intense internal dynamism, making for more complex and lasting patterns of meaning.

That sounds right. Given the limits of space, my acquisition of hardbacks and paperbacks must slow down; at this point, the ones on hand are saturated enough with significance to last the rest of my days. But the e-texts filling my coffee cup can accumulate as rapidly as ever. No shelf bends under the weight, and their imprint on my memory is like footprints in the snow.

Essay argues that Aaron Swartz was wrong

Stewart Brand is credited with coining the phrase "information wants to be free." In the wake of the suicide of 26-year-old cyber activist Aaron Swartz, we need to re-evaluate that assumption.

Brand, the former editor of The Whole Earth Catalog and a technology early adopter, is a living link between two great surges in what has been labeled "the culture of free": the 1967 Summer of Love and the Age of Information that went supernova in the late 1990s. Each period has stretched the definition of "free."

During the Summer of Love, the Diggers Collective tried to build a money-free enclave in San Francisco’s Haight-Ashbury district. They ran "free" soup kitchens, stores, clinics, and concerts. Myth records this as a noble effort that ran aground; history reveals less lofty realities. "Free" was in the eye of the beholder. The Diggers accumulated much of the food, clothing, medicine, and electronic equipment they redistributed by shaking down local merchants like longhaired mob muscle. Local merchants viewed Digger "donations" as a cost of doing business analogous to lost revenue from shoplifting. Somebody paid for the goods; it just wasn’t the Diggers or their clients.

Move the clock forward. Aaron Swartz’s martyr status crystallizes as I type. As the legend grows, Swartz was a brilliant and idealistic young man who dropped out of Stanford and liberated information for the masses until swatted down by multinational corporations, elitist universities, and the government. Faced with the potential of spending decades behind bars for charges related to hacking into JSTOR, a depressed Swartz committed suicide. (In truth, as The Boston Globe has reported, a plea bargain was nearly in place for a four-to-six-month sentence.)

I am sorry that Swartz died, and couldn’t begin to say whether he was chronically depressed, or if his legal woes pushed him over the edge. I do assert, though, that he was no hero. The appropriate label is one he once proudly carried: hacker. Hacking, no matter how principled, is a form of theft.

It’s easy to trivialize what Swartz did because it was just a database of academic articles. I wonder if his supporters would have felt as charitable if he had "freed" bank deposits. His was not an innocent act. The Massachusetts Institute of Technology and the Commonwealth of Massachusetts took the not-unreasonable position that there is a considerable difference between downloading articles from free accounts registered with a university, and purloining 4.8 million documents by splicing into wiring accessed via unauthorized entry into a computer closet. That’s hacking in my book – the moral equivalent of diverting a bank teller with a small transaction whilst a partner ducks behind the counter and liberates the till.

Brand and his contemporaries often parse the definition of free. Taking down barriers and making data easier to exchange is “freeing” in that changing technology makes access broader and cheaper to deliver. Alas, many young people don’t distinguish between "freeing" and "free." Many of my undergrads think nearly all information should come at no cost – free online education, free movies, free music, free software, free video games…. Many justify this as Swartz did: that the value of ideas and culture is artificially inflated by info robber barons.

They’re happy to out the villains: entrenched university administrations, Hollywood producers, Netflix, the Big Three record labels, Amazon, Microsoft, Nintendo, Sega…. I recently had a student pulled from my class and arrested for illegal music downloading. He was considerably less worried than Swartz and pronounced, "I fundamentally don’t believe anyone should ever have to pay for music." This, mind you, after I shared tales of folk musicians and independent artists who can’t live by their art unless they can sell it.

Sorry, but this mentality is wrong. Equally misguided are those who, like Swartz before his death, seek to scuttle the Stop Online Piracy Act and the Protect Intellectual Property Act. Are these perfect bills? No. Do they protect big corporations, but do little to shelter the proverbial small fish? Yes. Do we need a larger political debate about the way in which conglomeration has stifled innovation and competition? Book me a front-row seat for that donnybrook. Are consumers of everything from music to access to academic articles being price gouged? Probably. But the immediate possibility of living in a world in which everything is literally free is as likely as the discovery of unicorns grazing on the Big Rock Candy Mountain.

Let’s turn to JSTOR, the object of Swartz’s most recent hijinks. (He was a repeat offender.) JSTOR isn’t popular among librarians seeking subscription money, or those called upon to pay for access to an article (which is almost no one with a university account who doesn’t rewire the network). Many wonder why money accrues to those whose only "creation" is to aggregate the labor of others, especially when some form of taxpayer money underwrote many of the articles. That’s a legitimate concern, but defending Swartz’s method elevates vigilantism above the rules of law and reason. More to the point, reckless "liberation" often does more harm than good.

JSTOR charges university libraries a king’s ransom for its services. Still, few libraries could acquire JSTOR’s 1,400 journals more cheaply on their own. (Nor do many have the space to store the physical copies.) Institutional subscriptions to top journals are pricey. Go to the Oxford University Press website and you’ll find that very few can be secured for under $200 per volume, and several are over $2,000. One must ultimately confront a question ignored by the culture of free: Why does information cost so much?

Short answer: Because journals don’t grow on trees. It’s intoxicating to think that information can be figuratively and literally free, until one assembles an actual journal. I don’t care how you do it; it’s going to cost you.

I’m the associate editor of a very small journal in the academic pond. We still offer print journals, which entails thousands of dollars in printing and mailing costs for each issue. Fine, you say, print is dead. Produce an e-journal. Would that be "free"? Our editor is a full-time academic. She can only put in the hours needed to sift articles, farm them out for expert review, send accepted articles to copy editors, forward copy to a designer, and get the journal to subscribers because her university gives her a course reduction each semester. That’s real money; it costs her department thousands of dollars to replace her courses. Design, copy editing, and advertising fees must be paid, and a few small stipends are doled out. Without violating confidentiality I can attest that even a modest journal is expensive to produce. You can’t just give it away, because subscribers pick up the tab for everything that can’t be bartered.

Could you do this free online with no membership base? Sure – with a team of editors, designers, and Web gurus who don’t want to get paid for the countless hours they will devote to each issue. Do you believe enough in the culture of free to devote your life to uncompensated toil? (Careful: The Diggers don’t operate those free stores anymore.) By the way, if you want anyone to read your journal, you’ll give it to JSTOR or some other aggregator. Unless, of course, you can drum up lots of free advertising.

The way forward in the Age of Information begins with an honest assessment of the hidden costs within the culture of free. I suggest we retire the sexy-but-hollow phrase “information wants to be free" and resurrect this one: "There’s no such thing as a free lunch." And for hackers and info thieves, here’s one from my days as a social worker: "If you can’t do the time, don’t do the crime."

Rob Weir teaches history at Smith College. He is the author of Inside Higher Ed's "Instant Mentor" career advice column.


