At 30 years old, I definitely consider myself part of the Facebook generation. Zuckerberg’s brainchild hit the ’net when I was a senior in college, and by then I was already well acquainted with e-mail, chat rooms, text-messaging, and all the multifarious precursors to today’s social media. I text, I post, I chat, I even snapchat: in these respects, I’m an utterly unremarkable member of my society.
But I also happen to be a college professor and a molder of young minds. And, far from indulging the technology-driven spirit of the times, I make my students work as students have always worked. They read Seneca, Pascal, Tolstoy, and Schopenhauer. They are obliged to turn in papers by hand; they must come to office hours to speak with me about their grades; they are even, and this is most anachronistic of all, required to attend class. Physical presence is key to every aspect of their learning experience, be it my hovering, breathing presence in the classroom or the office, the cohort of 30 or so warm bodies that shows up for lecture twice a week, or the more abstract form of embodiment conveyed by the weight of a book.
To believe certain commentators, however, this embodied notion of learning is on its way out in American higher education. Writing for The American Interest’s January/February 2013 edition, the recent Yale graduate Nathan Harden offers the following ominous prognostications about the future of university instruction in our digital age:
In fifty years, if not much sooner, half of the roughly 4,500 colleges and universities now operating in the United States will have ceased to exist. The technology driving this change is already at work, and nothing can stop it. The future looks like this: Access to college-level education will be free for everyone; the residential college campus will become largely obsolete; tens of thousands of professors will lose their jobs; the bachelor’s degree will become increasingly irrelevant; and ten years from now Harvard will enroll ten million students.
On Harden’s account, one of the principal reasons for this portended transformation, which is already being partially implemented by such institutions as Harvard and MIT, is that the cost of college is increasingly out of proportion with its perceived economic benefit. As the American job market has become more competitive, the cost of a degree has increased, and only the most naïve of students still believe that a college education is a universally redeemable ticket to middle-class prosperity. The weighing up of costs and benefits involved in earning a college degree will lead inevitably to a re-evaluation of the current higher education model. Luxury residence halls, face-to-face interaction between professors and students, ivied brick walls -- these will all be things of the past once the much-heralded education bubble finally bursts. What will replace them are massively populated, inexpensive online courses and lectures, prerecorded by the very best lecturers and administered by those hordes of professors and other academics not quite sexy or charismatic enough to warrant virtual celebrity.
If you think Harden’s predictions are a little too ambitious (not to mention deeply disturbing, at least for college professors who don’t fancy the idea of working in a grading factory), don’t worry -- they most likely are. What Harden forgets -- and indeed, what just about everyone prophesying the eclipse of face-to-face interaction in a virtual world forgets -- is that human beings are, above all else, bodies, and that to lead full, happy, and meaningful lives, we need other bodies. Let’s consider the following examples of how technologies of virtualization have failed to triumph over our species’ thirst for physical presence.
1. The Giant Head. Some older readers may recall a famous article in Reader’s Digest from the late 1950s featuring an illustration of a massive human head connected to minuscule arms and legs. What was the thesis of that article? The tech junkies of the time believed that in the future technology would become so advanced that human beings would no longer need to use their bodies, leading to a swelling of the brain and a shriveling of our appendages. Many also foretold a time when food supplements would replace food. Wouldn’t it be great, they asked, if instead of spending hours preparing and eating meals, we could nourish ourselves in just a few seconds? No one at the time seemed to consider that human beings might not want to do any of this — that we might enjoy using our bodies, eating, and the like. In the half-century since these predictions were made, restaurants have proliferated, and heads haven’t grown one bit.
2. Live Theater. When I was a kid, there were hardly any live theaters in my hometown of Bakersfield, Calif. Now there are about ten. Many people used to believe that movies had sounded the death knell for live theater, but today the latter enjoys just as much, if not more, prestige than it did 100 years ago. I recently had the good fortune to see Kevin Spacey’s production of Richard III. I’ll remember his performance for the rest of my life — it had never occurred to me that acting could be so visceral, so violent, so physical. How many of us can say the same thing about movies? Again, those who foretold the demise of live theater never reckoned that people might just plain like seeing living bodies move around and speak on the stage, and that no amount of special effects could compensate for the lack of real flesh and blood.
3. The myth of social media. This myth holds that virtual, online or technologically mediated interactions are in the process of replacing face-to-face interactions. Most people never take the time to think about what the world would be like if this were really the case. I live in a small college town, and I can assure anyone interested in such things that student interactions on Friday and Saturday nights are plenty physical — sometimes I can hear them from across the lake! Social media does little more than provide a way of sharing information that enhances the intimacy of eventual physical contact. Anyone who doesn’t know this doesn’t understand the technology.
Of course, people like Harden will point to other sectors of the economy where technological innovation has erased thousands of jobs. People don’t need information from stockbrokers or travel agents to make decent decisions about travel or investment anymore, so why should a living, breathing professor be necessary to convey the sort of information one gets out of a college education? If that information can be distributed more cheaply thanks to virtualization, why should students be expected to bear the extra expense of classroom education?
The answer to this question is so elementary that the objection is almost hard to take seriously. The truth is that education is not simply the conveying of information. In fact, it is probably only marginally that. How many people remember most of what they learned in college? Only very few, I would guess. The benefit of a classroom education is that it keeps students under a certain amount of mental pressure, forces them to think on the spot, and obliges them to explain themselves to other people who are physically present. Information is afoot in these interactions, but so are wisdom, passion, empathy, and a whole host of other viscera that only an embodied teacher or student can properly convey.
How effective, for instance, do we imagine an online church experience would be compared to the real thing? Is it reasonable to think that a virtual tour of the cathedral at Chartres would be as spiritually moving as being there? We should also consider that many students might simply enjoy the physical classroom and their interaction with peers and professors -- or at least they might recognize that they learn better under these conditions. The costs of classroom education may be soaring out of proportion at present, but this is not a verdict on the education itself.
So let’s ask -- what developments are behind these grim auguries of the collapse of America’s higher education model? Some of it undoubtedly has to do with politics. Many commentators on the right (and perhaps Harden is one of them) would likely cheer the dismantlement of a system whose values are often perceived as far left of center. If taking education online can put “tenured radicals” out of work, then why not welcome it? At the same time, however, just as many moderate and left-leaning thinkers have joined the chorus of those predicting the failure of higher education (for instance, see Thomas Friedman’s recent writings in The New York Times), and it would be simplistic to chalk this latest round of doom-peddling up to politics.
The real culprit, I suggest, is what, for lack of a better term, we might call Appleism. Innocent in principle but nefarious in practice, the doctrine of Appleism holds that increases in technological capability are synonymous with increases in human happiness. Anything that can be put on a screen is better than what can be seen with the naked eye. The passage of electrons through a cathode tube is equivalent to passage from a lower to a higher state of being. Proponents of Appleism hold out technology as an intrinsic good; they are the sorts of folks who compulsively buy the latest Apple product, simply on principle.
We can point to fiscal insolvency all we want, but one has difficulty believing that Harden’s and others’ vision of a fully or almost-fully online education is not also the product of society’s limitless fascination with virtualization. Proponents of the current craze ought to think carefully about the human costs of technology before enthusiastically proclaiming the end of a system; doing so could leave hundreds of thousands of people without work, cheat students out of a quality education, and further contribute to the creation of a world where virtualization is always and everywhere, without qualification or questioning, heralded as an unequivocal good.
Louis Betty is an assistant professor of French at the University of Wisconsin-Whitewater.
A bunch of educators, several of whom I know and respect quite a bit, got together last month to write a "bill of rights" for online learners. Viewable and editable here.
They included the rights to access, privacy, openness, to create public knowledge, to "pedagogical transparency" (to understand the ways you are being taught and the value of any credentials offered), "financial transparency" (Where is my tuition money going? How will this “free course” be paid for?), to have great teachers, and to become teachers.
I can’t find myself disagreeing with anything much that they had to say, except for one screaming contradiction that brings the whole thing down.
"All too often, during such wrenching transitions, the voice of the learner gets muffled," this group wrote in their introduction.
The problem is, this group didn't include any learners. Of the 12 signatories, I count 8 Ph.D.s or Ph.D. equivalents. They didn’t reach out to any learners on public forums. They didn’t ask any learners what they wanted to put in the document. The voice of learners is absolutely silent.
Sure, we’re all lifelong and informal learners in some sense, but let’s draw a real distinction here. Let’s talk about people who don’t have a bachelor’s degree and need one or the equivalent to make a decent living and participate in society on an equal footing. I’m not asking why the group didn’t poll Udacity users in Pakistan or Colombia, or YouMedia high school students in Chicago, or middle schoolers around the globe making their way through Khan Academy math videos, and find out exactly what their concerns are and how they would prefer to have them represented in such a document. Although really, it wouldn’t have taken much time or many resources to do this kind of research. I’m asking why they wrote a “learners’ bill of rights” without including one actual learner in their little group of 12.
I’m not going to be tendentious and draw parallels with other bills of rights. I’m not going to ask about the advisability of men writing a feminist Bill of Rights on behalf of the women they care about so deeply. Or of the North writing a bill of rights for Southerners after the Civil War. Or of employers writing a bill of rights for their employees.
Suffice it to say that educators are in a historical position of no small authority over learners. And when one group of people with authority over another makes up the rights for the second group, they tend to get some things wrong.
The fact is, this isn’t a bill of rights for learners at all. It’s a set of principles to support the interests of a group of educators, who share concern for learners, blended with concern for their own group. They tip their hand in the eighth principle, “The right to have great teachers.”
“Students should expect -- indeed demand -- that the people arranging, mentoring and facilitating their learning online be financially, intellectually and pedagogically valued and supported by institutions of higher learning and by society. Teachers’ know-how and working conditions are students’ learning conditions.”
I am in favor of all who work with learners being fairly paid, and I am definitely in favor of great teachers. But I am not in favor of students being drafted onto the metaphorical or actual picket lines. Students in state four-year institutions are paying more and more of the salaries of their instructors and going into sometimes-extreme debt to do it. There’s an uncomfortable moment where the interests of the learners actually diverge from the interests of the career academics, and it should be discussed openly.
But enough. The authors intended this to be a living document, and I respect that there’s time to revise and collect comments from the hundreds of thousands of online learners out there. It’s not going to be that difficult.
When I first found out about this bill of rights, I posted it to OpenStudy, the online learning community. I got this response from an undergraduate computer science major within 45 minutes, which reads in part:
“you deserve education BASED ON WHAT YOU WANT TO DO IN LIFE..
Teach kids real world problems, and have them enjoy it…
Teachers/professors who care. In my time I have met a lot of wonderful professors, mentors, teachers, coaches, and a ton of HORRIBLE ones…
The job market sucks, and with students being taught the same thing, and not really learning what they wish it's hard to distinguish someone from the rest of the pack. If we want to succeed we need to produce students who enjoy learning, and have the tools to learn what THEY WANT TO LEARN."
Another wrote: "The rights I want in the ever-growing digital era are not anything different than what I would want outside of it. We have to expand these rights to be applicable into the digital world."
That’s a good start. Now there’s time to come up with a set of amendments -- a real learners’ bill of rights.
The Massachusetts Institute of Technology professor leading its inquiry into whether it inappropriately handled the federal prosecution of Aaron Swartz has provided some details on the investigation. In an open letter published in The Tech, MIT's student newspaper, Hal Abelson pledged a full and open inquiry, and said that the issues were extremely important. "This matter is urgently serious for MIT," Abelson wrote. "The world respects us not only for our scholarship and our science, but because we are an institution whose actions are and always have been guided by the highest ideals and the most thoughtful judgment. Our commitment to those ideals is now coming into question. At last Saturday’s memorial, Aaron’s partner Taren Stinebrickner-Kauffman described his mental state: 'He faced indifference from MIT, an institution that could have protected him with a single public statement and refused to do so, in defiance of all of its own most cherished principles.'"
Abelson also announced the creation of a website on which MIT students and faculty members can suggest questions that the review should consider. The site can be viewed by people without MIT affiliations, but they may not contribute.
One potentially positive result of the current fascination with online education is that universities and colleges may be forced to define and defend quality education. This analysis of what we value should help us to present to the public the importance of higher education in a high-tech world. However, the worst thing we could do is equate university education with its worst forms of instruction, which would in turn open the door for distance learning.
Perhaps the most destructive aspect of higher education is the use of large lecture classes. Not only does this type of learning environment tend to focus on students memorizing information for multiple-choice tests, but it also undermines any real distinction between in-person and online education. As one educational committee at the University of California at Los Angeles argued, we should just move most of our introductory courses online because they are already highly impersonal and ineffective. In opposition to this argument, we need to define and defend high-quality in-person classes.
Although some would argue that we should prepare students for the new high-tech world of self-instruction, we still need to teach students how to focus, concentrate, and sustain attention. In large classes, where the teacher often does not even know if the students are in attendance, it is hard to get students to stay on task, and many times, these potential learners are simply surfing the web or text messaging. In a small class, it is much harder for students to be invisible and to multitask, and while some may say that it is not the role of university educators to socialize these young adults, it is clear that the current generation of students does need some type of guidance in how they use technology and participate in their own education.
When people multitask, it often takes them twice as long to complete a task, and they do it half as well. For instance, my students tell me that when they try to write a paper, they are constantly text messaging and surfing the web: the result is that they spend hours writing their essays, and their writing is often disjointed and lacking in coherence. Since they are not focused on a single task, they do not notice that the ideas and sentences in their essays do not flow or cohere. Literally and figuratively, these multitasking students are only partially present when they are writing and thinking.
This lack of presence also shows up in the classroom. Students often act as if they are invisible in small classes because in their large lecture classes they are in many ways not present. Many students seem to lack any awareness of how they appear to others, and they are so used to sleeping in their large classes that they do not think about how their present absence appears to other students in a smaller class. Of course, it is much more difficult for students to be either literally or figuratively absent in a small class, but some students have been socialized by their large lecture classes to ignore the different expectations of more intimate learning environments.
As many higher education teachers have experienced, some students are able to participate in online discussion forums but have a hard time speaking in their small seminars. Once again, students may find it difficult to be present in front of others and to take the risk of presenting their own ideas. Some distance educators argue that we can resolve this problem by just moving classes online, but do we really want to train a generation of students who do not know how to communicate with other people in a natural setting?
I worry that students are losing the ability to make eye contact and read body language, and that they are not being prepared to be effective citizens, workers, and family members. This disconnect from in-person communication also relates to a distance from the natural world, and a growing indifference to the destruction of our environment. In this alienation from nature and natural environments, people also lose the ability to distinguish between true and false representations. Since on the web, everything is a virtual image or simulation generated by digital code, we live in a state of constant in-difference.
The web also creates the illusion that all information is available and accessible to anyone at any time. This common view represses the real disparities of access in our world and also undermines the need for educational experts. After all, if you can get all knowledge from Wikipedia or a Google search, why do you need teachers or even colleges? In response to this attitude, we should recenter higher education away from the learning of isolated facts and theories and concentrate on teaching students how to do things with information. In other words, students need to be taught by expert educators about how to access, analyze, criticize, synthesize, and communicate knowledge from multiple perspectives and disciplines.
While online educators argue that the traditional methods of instruction I have been discussing are outdated because they do not take into account the ways the new digital youth learn and think, I would counter that there is still a great need to teach students how to focus, concentrate, and discover how to make sense of the information that surrounds them. Too many online enthusiasts sell the new generation of students short by arguing that they can only learn if they are being entertained or if learning is an exciting, self-paced activity. Yet, we still need to teach people to concentrate and sustain their attention when things may get a little boring or difficult. Not all education should be fast-paced and visually stimulating; rather, people have to learn how to focus and stick with difficult and challenging tasks.
In this age of distracted living, where people crash their cars while text messaging and parents ignore their children while multitasking, do we really want a generation of students to take college classes on their laptops as they text, play games, and check their Facebook status updates? Isn’t there something to value about showing up to a class at the right time and the right place with the proper preparation and motivation? The idea of anytime, anyplace education defeats the purpose of having a community of scholars engaged in a shared learning experience. Furthermore, the stress on self-paced learning undermines the value of the social nature of education; the end result is that not only are students studying and bowling alone, but they are being seduced by a libertarian ideology that tells them that only the individual matters, and there is no such thing as a public space anymore.
When students have to be in a class and listen to their teacher and fellow learners, they are forced to turn off their cell phones and focus on a shared experience without the constant need to check their Facebook pages or latest texts. This experience represents one of the only reprieves young people will have from their constantly connected lives. In fact, students have told me that they would hate to take their classes online because they already feel addicted to their technologies. From their perspective, moving required classes online is like giving free crack to addicts and telling them that it will be good for them.
In order to help my students understand their dependence on technology and their alienation from nature and their own selves, I often bring them outside and tell them that they cannot do anything. This exercise often makes students very anxious, and when I later have students free-write about the experience, they write that they are not used to just doing nothing, and they felt an intense need to reach for their phones: this dependence on communication technologies will only be enhanced by moving to distance education.
Online education, then, not only adds to our culture of distracted multitasking; it also often functions to undermine the values of university professors. In the rhetoric of student-centered education, the teacher is reduced to being a "guide on the side," and this downgraded position implies that there is no need to give this facilitator tenure or a stable position. Instead, through peer grading and computer-assisted assessment, the role of the teacher is being eliminated, and so it is little wonder that colleges operating only online employ most of their faculty off the tenure track.
These online colleges and universities have also separated teaching from research and have basically “unbundled” the traditional role of the faculty member. As with the undermining of newspapers by new media, we now have more sources of information but fewer people being paid to do the actual on-the-ground work of researching and reporting. And just as Wikipedia has turned every amateur into a potential expert, our society is losing the value of expert, credentialed educators. Although some see this as a democratization of instruction and research, it can also be read as a destruction of the academic business model and a move to make people work for free as traditional jobs are downsized and outsourced.
Many online programs proclaim that education is democratized by having students grade each other’s work, but isn’t this confusion between the roles of the student and the teacher just a way of rationalizing the elimination of the professor? Moreover, the use of computer programs to assess student learning is only possible if people think that education is solely about rote memorization and standardization. Yes, we can use computers to grade students, but only if we think of students as standardized computer programs.
In contrast to massive open online courses, small, in-person classes often force students to encounter new and different perspectives, and the students cannot simply turn off the computer or switch the channel. Unfortunately, too many colleges and universities rely on large lecture courses that allow students to tune out during class and then teach themselves the material outside of class. While I am all for flipping the class and having students learn the course content outside of the classroom, we still need to use actual class time to help students engage in research in a critical and creative fashion.
This push for small interactive classes will be resisted by the claim that it is simply too expensive to teach every student in this type of learning environment. However, my research shows that it is often more expensive to teach students in large lecture classes than in small seminars once you take into account the full cost of having graduate assistants teach the small sections attached to the large classes. Furthermore, the direct cost of hiring faculty to teach courses is often a fraction of the total cost of instruction, and massive savings could be generated if higher education institutions focused on their core missions and not the expensive areas of sponsored research, athletics, administration, and professional education. Being present at the university means that students and teachers are present in their classes and that education is the central presence of the institution.
(Illustration by Giulia Forsythe, licensed under Creative Commons Attribution-ShareAlike 2.5 agreement)
Submitted by Rob Weir on January 22, 2013 - 3:00am
Stewart Brand is credited with coining the phrase "information wants to be free." In the wake of the suicide of 26-year-old cyber activist Aaron Swartz, we need to re-evaluate that assumption.
Brand, the former editor of The Whole Earth Catalog and a technology early adopter, is a living link between two great surges in what has been labeled "the culture of free": the 1967 Summer of Love and the Age of Information that went supernova in the late 1990s. Each period has stretched the definition of "free."
During the Summer of Love, the Diggers Collective tried to build a money-free enclave in San Francisco’s Haight-Ashbury district. They ran "free" soup kitchens, stores, clinics and concerts. Myth records this as a noble effort that ran aground; history reveals less lofty realities. "Free" was in the eye of the beholder. The Diggers accumulated much of the food, clothing, medicine, and electronic equipment they redistributed by shaking down local merchants like longhaired mob muscle. Local merchants viewed Digger "donations" as a cost of doing business analogous to lost revenue from shoplifting. Somebody paid for the goods; it just wasn’t the Diggers or their clients.
Move the clock forward. Aaron Swartz’s martyr status crystallizes as I type. As the legend grows, Swartz was a brilliant and idealistic young man who dropped out of Stanford and liberated information for the masses until swatted down by multinational corporations, elitist universities, and the government. Faced with the potential of spending decades behind bars for charges related to hacking into JSTOR, a depressed Swartz committed suicide. (In truth, as The Boston Globe has reported, a plea bargain was nearly in place for a four-to-six-month sentence.)
I am sorry that Swartz died, and couldn’t begin to say whether he was chronically depressed, or if his legal woes pushed him over the edge. I do assert, though, that he was no hero. The appropriate label is one he once proudly carried: hacker. Hacking, no matter how principled, is a form of theft.
It’s easy to trivialize what Swartz did because it was just a database of academic articles. I wonder if his supporters would have felt as charitable if he had "freed" bank deposits. His was not an innocent act. The Massachusetts Institute of Technology and the Commonwealth of Massachusetts took the not-unreasonable position that there is a considerable difference between downloading articles from free accounts registered with a university, and purloining 4.8 million documents by splicing into wiring accessed via unauthorized entry into a computer closet. That’s hacking in my book – the moral equivalent of diverting a bank teller with a small transaction whilst a partner ducks behind the counter and liberates the till.
Brand and his contemporaries often parse the definition of free. Taking down barriers and making data easier to exchange is “freeing” in that changing technology makes access broader and cheaper to deliver. Alas, many young people don’t distinguish between "freeing" and "free." Many of my undergrads think nearly all information should come at no cost – free online education, free movies, free music, free software, free video games…. Many justify this as Swartz did: that the value of ideas and culture is artificially inflated by info robber barons.
They’re happy to out the villains: entrenched university administrations, Hollywood producers, Netflix, the Big Three record labels, Amazon, Microsoft, Nintendo, Sega…. I recently had a student pulled from my class and arrested for illegal music downloading. He was considerably less worried than Swartz and pronounced, "I fundamentally don’t believe anyone should ever have to pay for music." This, mind you, after I shared tales of folk musicians and independent artists that can’t live by their art unless they can sell it.
Sorry, but this mentality is wrong. Equally misguided are those who, like Swartz before his death, seek to scuttle the Stop Online Piracy Act and the Protect Intellectual Property Act. Are these perfect bills? No. Do they protect big corporations, but do little to shelter the proverbial small fish? Yes. Do we need a larger political debate about the way in which conglomeration has stifled innovation and competition? Book me a front-row seat for that donnybrook. Are consumers of everything from music to access to academic articles being price gouged? Probably. But the immediate possibility of living in a world in which everything is literally free is as likely as the discovery of unicorns grazing on the Big Rock Candy Mountain.
Let’s turn to JSTOR, the object of Swartz’s most recent hijinks. (He was a repeat offender.) JSTOR isn’t popular among librarians who must find the subscription money, or among those called upon to pay for access to an article (which is almost no one with a university account who doesn’t rewire the network). Many wonder why money accrues to those whose only "creation" is to aggregate the labor of others, especially when some form of taxpayer money underwrote many of the articles. That’s a legitimate concern, but defending Swartz’s method elevates vigilantism above the rules of law and reason. More to the point, reckless "liberation" often does more harm than good.
JSTOR charges university libraries a king’s ransom for its services. Still, few libraries could subscribe to JSTOR’s 1,400 journals more cheaply. (Nor do many have the space to store the physical copies.) The institutional costs for top journals are pricey. Go to the Oxford University Press website and you’ll find that very few journals can be secured for under $200 per volume, and several are over $2,000. One must ultimately confront a question ignored by the culture of free: Why does information cost so much?
Short answer: Because journals don’t grow on trees. It’s intoxicating to think that information can be figuratively and literally free, until one assembles an actual journal. I don’t care how you do it; it’s going to cost you.
I’m the associate editor of a very small journal in the academic pond. We still offer print journals, which entails thousands of dollars in printing and mailing costs for each issue. Fine, you say, print is dead. Produce an e-journal. Would that be "free"? Our editor is a full-time academic. She can only put in the hours needed to sift articles, farm them out for expert review, send accepted articles to copy editors, forward copy to a designer, and get the journal to subscribers because her university gives her a course reduction each semester. That’s real money; it costs her department thousands of dollars to replace her courses. Design, copy editing, and advertising fees must be paid, and a few small stipends are doled out. Without violating confidentiality I can attest that even a modest journal is expensive to produce. You can’t just give it away, because subscribers pick up the tab for everything that can’t be bartered.
Could you do this free online with no membership base? Sure – with a team of editors, designers, and Web gurus who don’t want to get paid for the countless hours they will devote to each issue. Do you believe enough in the culture of free to devote your life to uncompensated toil? (Careful: The Diggers don’t operate those free stores anymore.) By the way, if you want anyone to read your journal, you’ll give it to JSTOR or some other aggregator. Unless, of course, you can drum up lots of free advertising.
The way forward in the Age of Information begins with an honest assessment of the hidden costs within the culture of free. I suggest we retire the sexy-but-hollow phrase “information wants to be free" and resurrect this one: "There’s no such thing as a free lunch." And for hackers and info thieves, here’s one from my days as a social worker: "If you can’t do the time, don’t do the crime."
Rob Weir teaches history at Smith College. He is the author of Inside Higher Ed's "Instant Mentor" career advice column.
Michael Barera has been named Wikipedian in residence at the Gerald R. Ford Presidential Library at the University of Michigan -- the first such position at a presidential library. Barera will focus on expanding the availability of information about President Ford and the library's holdings on Wikipedia through the Gerald Ford WikiProject.