Submitted by Rob Weir on January 22, 2013 - 3:00am
Stewart Brand is credited with coining the phrase "information wants to be free." In the wake of the suicide of 26-year-old cyber activist Aaron Swartz, we need to re-evaluate that assumption.
Brand, the former editor of The Whole Earth Catalog and a technology early adopter, is a living link between two great surges in what has been labeled "the culture of free": the 1967 Summer of Love and the Age of Information that went supernova in the late 1990s. Each period has stretched the definition of "free."
During the Summer of Love, the Diggers Collective tried to build a money-free enclave in San Francisco’s Haight-Ashbury district. They ran "free" soup kitchens, stores, clinics and concerts. Myth records this as a noble effort that ran aground; history reveals less lofty realities. "Free" was in the eye of the beholder. The Diggers accumulated much of the food, clothing, medicine, and electronic equipment they redistributed by shaking down local merchants like longhaired mob muscle. Local merchants viewed Digger "donations" as a cost of doing business analogous to lost revenue from shoplifting. Somebody paid for the goods; it just wasn’t the Diggers or their clients.
Move the clock forward. Aaron Swartz’s martyr status crystallizes as I type. As the legend grows, Swartz was a brilliant and idealistic young man who dropped out of Stanford and liberated information for the masses until swatted down by multinational corporations, elitist universities, and the government. Faced with the prospect of spending decades behind bars on charges related to hacking into JSTOR, a depressed Swartz committed suicide. (In truth, as The Boston Globe has reported, a plea bargain was nearly in place for a four-to-six-month sentence.)
I am sorry that Swartz died, and couldn’t begin to say whether he was chronically depressed, or if his legal woes pushed him over the edge. I do assert, though, that he was no hero. The appropriate label is one he once proudly carried: hacker. Hacking, no matter how principled, is a form of theft.
It’s easy to trivialize what Swartz did because it was just a database of academic articles. I wonder if his supporters would have felt as charitable if he had "freed" bank deposits. His was not an innocent act. The Massachusetts Institute of Technology and the Commonwealth of Massachusetts took the not-unreasonable position that there is a considerable difference between downloading articles from free accounts registered with a university, and purloining 4.8 million documents by splicing into wiring accessed via unauthorized entry into a computer closet. That’s hacking in my book – the moral equivalent of diverting a bank teller with a small transaction whilst a partner ducks behind the counter and liberates the till.
Brand and his contemporaries often parse the definition of free. Taking down barriers and making data easier to exchange is “freeing” in that changing technology makes access broader and cheaper to deliver. Alas, many young people don’t distinguish between "freeing" and "free." Many of my undergrads think nearly all information should come at no cost – free online education, free movies, free music, free software, free video games…. Many justify this as Swartz did: that the value of ideas and culture is artificially inflated by info robber barons.
They’re happy to out the villains: entrenched university administrations, Hollywood producers, Netflix, the Big Three record labels, Amazon, Microsoft, Nintendo, Sega…. I recently had a student pulled from my class and arrested for illegal music downloading. He was considerably less worried than Swartz and pronounced, "I fundamentally don’t believe anyone should ever have to pay for music." This, mind you, after I shared tales of folk musicians and independent artists who can’t live by their art unless they can sell it.
Sorry, but this mentality is wrong. Equally misguided are those who, like Swartz before his death, seek to scuttle the Stop Online Piracy Act and the Protect Intellectual Property Act. Are these perfect bills? No. Do they protect big corporations, but do little to shelter the proverbial small fish? Yes. Do we need a larger political debate about the way in which conglomeration has stifled innovation and competition? Book me a front-row seat for that donnybrook. Are consumers of everything from music to access to academic articles being price gouged? Probably. But the immediate possibility of living in a world in which everything is literally free is as likely as the discovery of unicorns grazing on the Big Rock Candy Mountain.
Let’s turn to JSTOR, the object of Swartz’s most recent hijinks. (He was a repeat offender.) JSTOR isn’t popular among librarians seeking subscription money, or among those called upon to pay for access to an article (almost no one with a university account, unless they rewire the network). Many wonder why money accrues to those whose only "creation" is to aggregate the labor of others, especially when some form of taxpayer money underwrote many of the articles. That’s a legitimate concern, but defending Swartz’s method elevates vigilantism above the rules of law and reason. More to the point, reckless "liberation" often does more harm than good.
JSTOR charges university libraries a king’s ransom for its services. Still, few libraries could subscribe to JSTOR’s 1,400 journals more cheaply. (Nor do many have the space to store the physical copies.) Institutional subscriptions to top journals are steep. Go to the Oxford University Press website and you’ll find that very few journals can be secured for under $200 per volume, and several run over $2,000. One must ultimately confront a question ignored by the culture of free: Why does information cost so much?
Short answer: Because journals don’t grow on trees. It’s intoxicating to think that information can be figuratively and literally free, until one assembles an actual journal. I don’t care how you do it; it’s going to cost you.
I’m the associate editor of a very small journal in the academic pond. We still offer print journals, which entails thousands of dollars in printing and mailing costs for each issue. Fine, you say, print is dead. Produce an e-journal. Would that be "free"? Our editor is a full-time academic. She can only put in the hours needed to sift articles, farm them out for expert review, send accepted articles to copy editors, forward copy to a designer, and get the journal to subscribers because her university gives her a course reduction each semester. That’s real money; it costs her department thousands of dollars to replace her courses. Design, copy editing, and advertising fees must be paid, and a few small stipends are doled out. Without violating confidentiality, I can attest that even a modest journal is expensive to produce. You can’t just give it away, because subscribers pick up the tab for everything that can’t be bartered.
Could you do this free online with no membership base? Sure – with a team of editors, designers, and Web gurus who don’t want to get paid for the countless hours they will devote to each issue. Do you believe enough in the culture of free to devote your life to uncompensated toil? (Careful: The Diggers don’t operate those free stores anymore.) By the way, if you want anyone to read your journal, you’ll give it to JSTOR or some other aggregator. Unless, of course, you can drum up lots of free advertising.
The way forward in the Age of Information begins with an honest assessment of the hidden costs within the culture of free. I suggest we retire the sexy-but-hollow phrase “information wants to be free" and resurrect this one: "There’s no such thing as a free lunch." And for hackers and info thieves, here’s one from my days as a social worker: "If you can’t do the time, don’t do the crime."
Rob Weir teaches history at Smith College. He is the author of Inside Higher Ed's "Instant Mentor" career advice column.
Aaron Swartz committed suicide last week at the age of 26. I would like to pay tribute to him by writing calm, elegiac prose conveying something of his intelligence, his passion, and the distinctive quality of puckishness that photographs of him managed to capture surprisingly well.
Unfortunately it does not look like that is going to be possible. Things would need to make more sense than they have, so far. Feelings of sadness and anger, which are perfectly appropriate responses, keep giving way to the paradoxical and incoherent state of mind in which I both grasp what has happened and simultaneously think that it can’t really be true. This reached its worst and most absurd expression in the passing thought that news of his suicide might be part of a scheme in which Aaron is alive and well, living under a new identity someplace where U.S. government prosecutors will never find him.
It’s possible! Well, no, of course it isn’t. This state of mind is what they call “being in denial,” and it’s embarrassing to recognize. But it hardly seems more irrational than the reality in question. For the government’s prosecution of Aaron for hacking into the Massachusetts Institute of Technology's system to download a few million articles from scholarly journals was not just a case of intellectual-property law being enforced with too much zeal. It seems more like an expression of vindictiveness.
Consider something just reported by the Associated Press: “Andrew Good, a Boston attorney who represented Swartz in the case last year, said he told federal prosecutors in Massachusetts that Swartz was a suicide risk. 'Their response was, put him in jail, he’ll be safe there,' Good said." It is too hard to think about that. Better to imagine him escaping, carrying on his work in silence, cunning, and exile.
He was already something of a legend when we met for lunch not quite five years ago, having already been in touch for a couple of years. At the time, he was known for his role in the creation of RSS and Infogami; his internet-freedom activism and legal troubles were to come. Among his projects had been the online archive he created for Lingua Franca magazine, then defunct though still widely admired. I had been a contributing writer for LF and heard about Aaron from a couple of friends, and was very glad to be able to interview him about the Open Library cataloging initiative he was helping to launch.
Not that long before we were able to meet face-to-face, Aaron had given a talk called “How to Get a Job Like Mine” which covered his career up through the age of 20. In person, he was modest about his teenage coding career, or at least disinclined to say much about it, and I never got the feeling that his later exploits in taking on the Public Access to Court Electronic Records (PACER) database and JSTOR involved anything like hacker vainglory.
In his activism (legal and otherwise) as in his early coding projects, the emphasis was always squarely on making access to information and tools more widely available, on the grounds that restricting the flow of knowledge served only to make already-powerful people still more powerful. Aaron seemed earnest without being dour or humorless, which struck me as giving him one leg up on his hero Noam Chomsky.
While trying to pull these impressions together, I had a moment of seeing something about Aaron that never crossed my mind while he was alive, although it seems, with hindsight, pretty obvious: He was as perfect an embodiment of the mythological being known as the trickster as anyone could possibly be. My copy of Lewis Hyde’s brilliant book Trickster Makes This World: Mischief, Myth, and Art (1998) has gone missing, but the author’s website has a pertinent description.
Trickster figures in various cultures “are the consummate boundary-crossers, slipping through keyholes, breaching walls, subverting defense systems. Always out to satisfy their inordinate appetites, lying, cheating, and stealing, tricksters are a great bother to have around, but paradoxically they are also indispensable heroes of culture. In North America, Coyote taught the race how to catch salmon, sing, and shoot arrows. In West Africa, Eshu introduced the art of divination so that suffering humans might know the purposes of heaven. In Greece, Hermes the Thief invented the art of sacrifice, the trick of making fire, and even language itself.”
The gods and worldly authorities alike think of the trickster as a criminal, or at least a bad apple. Furthermore, tricksters tend to be prodigies -- their genius for invention and disruption already evident in childhood, if not infancy. In the introduction to his book, Hyde writes that the trickster’s disregard for the rules “isn’t so much to get away with something or to get rich as to disturb the established categories of truth and property and, by so doing, open the road to possible new worlds.”
That names Aaron’s attitude beautifully, and my fleeting daydream that he might somehow be pulling a fast one on the authorities is like something out of a trickster narrative. The resemblance also goes some way towards explaining why, more than anyone I've ever met, he seems destined to be remembered as a hero for a long time to come. You don't get to make that many friends who are archetypes, but Aaron was an exceptional person no matter how you look at him.
There's a mean streak at the heart of a certain kind of American optimism -- a rugged, go-it-alone, dog-eat-dog strain of individualism that is callous at best, shading into the sociopathic. It values independence, or says it does, but only by regarding dependency as a totally abject condition. The reality that illness or old age threw even the hardiest pioneer into reliance on others hardly factors into this worldview; the notion that civilization implies interdependence is, for it, almost literally unthinkable.
As I say, this outlook can manifest itself as optimism (the future is one of unbounded possibility, etc.) not always distinct from wishful thinking or denial. And it’s just as likely to pour out in resentment that is keen, if not particularly consistent. “I am a victim,” the logic goes, “of all those people out there playing victim.” Absent a frontier, the frontier spirit starts wallowing in self-pity.
The absence of pity of any sort from Kim E. Nielsen’s new book A Disability History of the United States, published by Beacon Press, is hardly the most provocative thing about it. Nielsen, a professor of disability studies at the University of Toledo, indicates that it is the first book “to create a wide-ranging chronological American history narrative told through the lives of people with disabilities.” By displacing the able-bodied, self-subsisting individual citizen as the basic unit (and implied beneficiary) of the American experience, she compels the reader to reconsider how we understand personal dignity, public life, and the common good.
Take the “ugly laws,” for instance. During the late 19th and early 20th centuries, major American cities made it illegal for (in the words of the San Francisco ordinance from 1867) “any person who is diseased, maimed, mutilated, or in any way deformed so as to be an unsightly or disgusting object” to appear in “streets, highways, thoroughfares, or public places.”
The laws were unequally enforced, with poor and indigent people with handicaps being the main targets. For one thing, the impact of the Civil War plus the incredible frequency of industrial accidents meant there were more unsightly beggars than ever. But while deformed and damaged bodies were being cleared from the streets, there was a pronounced public appetite for the exhibits at “freak” shows.
Now, the two phenomena in question have been studied in some depth over the years. A monograph on the ugly laws appeared not that long ago -- and while there have been more detailed studies of the world of “human oddities” than the late Leslie Fiedler’s cultural history Freaks: Myths and Images of the Secret Self (1978), I doubt many have been nearly as thought-provoking. Nielsen’s historical narrative is presumably meant for undergraduates and the general public, so it’s natural to lose nuance in the treatment of either topic. But the breadth of the survey also means there is a gain in perspective.
No direct link exists between the policing of disabled bodies and their exploitation as entertainment, yet there is a connection even so. “In contestations over who was fit to be present in the civic world and who was not,” Nielsen says, “people with disabilities often found themselves increasingly regulated. Those not considered fit for public life were variably shut away, gawked at, [or] exoticized.”
It was a far cry from the norm of a century earlier. “The general lack of discussion and institutional acknowledgement of physical disabilities” in 17th- and 18th-century America “suggests that they simply were not noteworthy among communities of European colonists in the period before the Revolution,” Nielsen writes. “Indeed, it suggests that such bodily variations were relatively routine and expected – and accommodations were made, or simply didn’t have to be made, to integrate individuals into community labor patterns.”
Over time, in other words, disability became abnormal. Or at least it quit seeming “normal” in the way that it once did: a hard fact of life, to be sure, but just in the nature of things. Consider the way severely wounded veterans of the Revolutionary War reintegrated into the life of the new Republic. Citing recent historical work, Nielsen indicates that they “labored, married, had children, and had households typical in size and structures, at rates nearly identical to their nondisabled counterparts." They “worked at the same types of jobs, in roughly the same proportions” as well, and as a group they experienced poverty at the same rates as others of their background. The wounded returning from later wars had a much harder time of it.
Not all handicaps are created equal, of course. Nor is it self-evident that they should be lumped together (war wounds and birth defects, blindness and retardation, mental illness and dwarfism) under the common heading of “disability.” Nielsen sketches the changing ways political and medical authorities responded to the afflicted -- by trying to help them, or hide them, or both. In any case, the trend was to define them not by what they could do, but by their handicap. At the same time, attitudes towards the disabled were becoming tangled together with other prejudices. If certain people weren’t allowed to vote or otherwise exercise much power, it was only because their race, gender, or foreign origin left them physically or mentally unfit for it. Stigma and inequality fed off one another.
The very idea of being profoundly, inescapably limited in some way makes for anxiety when the cultural norm is the expectation “to create successful and powerful selves” that are ready to “stand on our own two feet” and “speak for ourselves.” Nielsen points out that the last two figures of speech are part of the problem. There are people who literally can’t “stand on their own two feet” or “speak for themselves.” While my exposure to the kinds of disability activists Nielsen writes about in the final pages of her history has been limited, they do seem to have an ironic and sarcastic (rather than po-facedly indignant) response to such "able-ist" imagery -- regarding it less as an insult than as evidence that the speaker is a bit thick. Which is usually true. The "unchallenged," as we might be called euphemistically, tend to be somewhat lacking in imagination and insight about their struggle for greater equality and autonomy.
And yet they have won some battles – by demanding help. By demanding a redistribution of resources on the basis of their intrinsic right, as human beings, to the dignity they could not enjoy otherwise. Someone in a wheelchair can zip around the neighborhood just fine, getting to her job at the pharmacy on time, provided the curbs are made accessible. And no, the person in the wheelchair is not responsible for paying for that, any more than her customers are responsible for mixing their own medications. Interdependence is not a failure of independence; it is the condition for enjoying the sort of independence that means anything at all.