“Before the Freedom of Information Act,” Henry Kissinger told a gathering of diplomats in Turkey in March 1975, “I used to say at meetings, ‘The illegal we do immediately; the unconstitutional takes a little longer.’ But since the Freedom of Information Act, I'm afraid to say things like that.”
Not that afraid, obviously. The Machiavellian quip got a laugh at the time, according to the official transcript -- and clearly it merits a spot in any future collection of familiar quotations, alongside Kissinger’s remark about power being the ultimate aphrodisiac. For now, it serves as the epigraph to a press release from WikiLeaks announcing the opening of the Public Library of U.S. Diplomacy, with its first collection consisting of more than 1.7 million diplomatic cables from 1973 to ’76.
All of the material was routinely (if belatedly) declassified after 25 years, per U.S. law, and has been available from the National Archives and Records Administration. WikiLeaks made the collection searchable and is “housing” it on servers presumably beyond the reach of Big Brother. Now they can’t be reclassified.
As announcements from WikiLeaks go, it’s all fairly underwhelming. But it does make an important revelation -- however unintentional -- by reminding the public that three years have passed since the group last made a world-shaking release of information. The leaks, it seems, have been plugged. Secret documents are staying secret. Even the most ardent admirer of Bradley Manning will be understandably reluctant to share his fate. While it is too soon to pronounce WikiLeaks dead, it does appear to be in a coma.
Castronovo, a professor of English and American studies at the University of Wisconsin at Madison, links the “Cablegate” of 2010 to a Revolutionary War-era incident through the concept of “a new kind of network actor” distinct from “the traditional person of liberal democracy.” The case in question was the Thomas Hutchinson affair of 1773, when letters by the governor of the Massachusetts Bay Colony somehow found their way into the hands of the Sons of Liberty, who then circulated them via newspaper and pamphlet.
Hutchinson had borne the brunt of serving His Majesty during the Stamp Act riots a few years earlier, and was in office during the Boston Massacre. In his correspondence he referred to the need for “abridgement of what are called English liberties” among the unruly colonial subjects, which was just so much gasoline on the fire.
The source of the leak was one Benjamin Franklin, colonial postmaster. Franklin later insisted that this ethical lapse was committed in an attempt (alas! unsuccessful) to reduce American hostility towards Parliament and the Crown by documenting that the real source of trouble was someone much lower in the chain of command. Castronovo treats this claim with greater suspicion than have some historians -- and not just because Franklin was such a master of irony, pseudonymous commentary, and the fake-out.
Franklin was also a node in multiple correspondence networks, and understood perfectly well how porous they could be. Alongside the official channels of communication between Court and colony, there were informal but durable long-distance connections among merchants, officials, publishers, and so on. A letter by someone within such a network tended to have, so to speak, an implicit “cc” or “bcc” field.
“More significant than the sending and receipt of private letters between individuals,” writes Castronovo, the activity of these epistolary networks “encompassed a range of public activities, including the recitation of letters aloud, the printing of handwritten letters in newspapers, the transmission of pamphlets, and the sending of circular letters by local governments....” Such communications might be “opened by third parties and forwarded without permission, shared in social circles and reprinted in newspapers.”
By transmitting Hutchinson’s letters to figures within his own circles who were in contact with more hot-headed revolutionary elements, Franklin was creating a political weapon against the authorities. He was, in effect, both a whistleblower and Julian Assange at the same time.
Having put it that way, however, I must immediately backtrack to say that the analogy is not Castronovo’s point at all. “At issue,” he writes, “is how communication spreads and metastasizes, how ideas proliferate and take root, how views and opinions propagate themselves.”
The network in each case – epistolary or digital – is not just a medium or tool that individuals use to communicate or act. In it, rather, “individual agency becomes unmoored from stable locations and is set adrift along an interconnected web of tendril-like links and nodes.” This is a perspective derived from the work of Bruno Latour, among others. It rejects the familiar way of thinking of society as consisting of distinct individuals who interact and so create networks. Instead -- to put things one way – it’s networks all the way down. Society emerges from a teeming array of networks that overlap and intersect, that get knotted together or fray with use.
Franklin’s catalytic intervention in the American crisis of 1773 was as effective as it was by virtue of his ability to channel communication from one network to another. And it was effective because it was done quietly; he advanced the revolutionary process involving “a public interlinked and excited by expressions of dissent” without making himself known. “In a perhaps uncharacteristic move,” Castronovo says, “Franklin refuses to occupy the center [of public discussion], instead preferring to sit back in the shadows where, after all, the shadowy work of espionage gets done.”
But the state – however much it may use networks of its own – insists on ascribing public action to individuals possessing stable and legible identities. By 1774, the Privy Council knew about Franklin’s role in the matter and summoned him to a hearing in London, where he was denounced, in humiliating terms, for more than an hour.
Bradley Manning, of course, faces worse – while the coiner of that witticism about operating illegally and unconstitutionally has never endured the consequences of his actions. What does that imply for a Latourian theory of social ontology? I don’t know, but it surely demonstrates that not all networks are equal before the law.
With so much focus on higher education's obligations to job preparation, the humanities are perpetually playing defense, especially in public higher education. We academic defenders of the humanities generally take one of two lines: we argue that 1) our majors ARE work force preparation -- we develop strong analytical skills, good writing, problem-solving, etc., or 2) we have no need to justify what we teach because the value of the humanities, the study of what makes us human, is self-evident.
These arguments over the value of degrees in the humanities run parallel to a set of arguments I find myself making as part of a role I occupy, as a board member for my state council for the humanities. The National Endowment for the Humanities allocates about a third of its funding through the state councils, and the councils in turn fund humanities initiatives at the state level.
State humanities councils such as mine (Rhode Island's) re-grant our NEH allocation as well as the money we raise locally to community humanities projects. We've funded research on communities of Cape Verdean longshoremen in Providence, oral histories of Second World War vets in hospice care, talk-back events at local theaters, seashore sound archives, a documentary film about a female 19th-century life-saving lighthouse-keeper, and lots of fascinating digital work, from archiving to app development. All the projects must involve humanities scholars — some of those scholars are affiliated with universities, and others aren’t. All of it aims at helping Rhode Islanders to understand ourselves, our histories, and our many cultures.
When economic times are tough, an agency such as the NEH is vulnerable unless legislators understand and value the role of the humanities in a strong democracy -- just as university humanities programs are vulnerable in state funding contexts when legislators, boards of trustees, or voters don't have a clear understanding of the value of the humanities in the culture and in the workplace.
In a career spent in higher education in the humanities, most of it at a liberal arts college, I rarely had to justify teaching what I taught. The value of an English major was self-evident to my colleagues and my students. Sure, the occasional parent would squeak, "But how will she make a living?" But I never hesitated to reassure the anxious check-writers of the value of our product. Having worked in the worlds of both journalism and Washington nonprofits, I knew how many good jobs demanded only a bachelor's degree, writing skills, research and analytic abilities, and common sense.
But then came the Great Recession and what many are calling the end of the higher education bubble. Questions about tuition increases, student debt, and colleges' lack of accountability (that is, the paucity of data on employment for recent graduates) get attached, in public perception, to the unemployment rate and to a re-emergence of the old post-Sputnik fears that the nation is not training enough folks in STEM fields.
Organizations such as the Association of American Colleges and Universities have been proactive in making the case for liberal learning as preparation for good citizenship, pointing to their surveys of employers. They have found that employers believe that the skills colleges should focus on improving are: written and oral communication; critical thinking and analytic reasoning; the application of knowledge and skills in real-world settings; complex problem solving; ethical decision making, and teamwork skills. These skills are not exclusive to the humanities, but they certainly line up with the student learning outcomes in humanities instruction at my institution.
It's not as if defenders of the values of a liberal arts education are ignoring economic realities: many liberal arts colleges are adding business majors, humanities fields are requiring internships and experiential learning, and colleges and universities are scrambling to make contact with successful alumni and to gather post-graduation employment data.
There's nothing wrong with linking liberal arts education in general, and the humanities in particular, to work. The humanities are inextricably linked to work and to U.S. civic life. When Lyndon Johnson signed legislation to bring the NEH into existence in 1965, it was in a context in which the federal government was pushed to invest in culture, as it had in science. NEH's account of its own history explains that the head of the Atomic Energy Commission told a Senate committee: "We cannot afford to drift physically, morally, or esthetically in a world in which the current moves so rapidly perhaps toward an abyss. Science and technology are providing us with the means to travel swiftly. But what course do we take? This is the question that no computer can answer."
Through my role in public humanities, I have come to understand that the humanities are what allow us to see ourselves as members of a civic community. Public history, public art, shared cultural experiences make us members of communities. This link has not been stressed enough in defense of the academic humanities. It's past time to make this important connection -- to help our boards of trustees, our communities, and our legislators to know what the humanities bring to civil society and give to students as they enter the workforce.
In the first class I ever taught as a teaching assistant, I did my first lecture on Death of a Salesman. My topic was work -- how Willy's job is his identity. I pointed to a student I knew in the 150-student lecture hall and told him that his surname, Scribner, probably indicated the employment of some ancestor of his, a "scrivener," like Bartleby. Then I asked who else had last names that might have indicated a job. We had Millers and Coopers and Smiths, and many more.
When those students' ancestors worked as barrel-makers or at their forges, they worked those jobs for life, and their sons afterward did the same. But how many of us do the job our parents did? How many of our students will do the same job in their 30s that they will do in their 20s? Narrow ideas about work force preparation will not prepare our students for the work of the rest of their lives. Each job they take will train them in the skills they need to succeed in that particular industry. But a broad, liberal education will have been what made them people worth hiring, people who have learned the value of curiosity, initiative, problem-solving. Students in STEM fields and students in arts, social sciences, and humanities all will become members of communities, and a good background in the humanities will enrich their membership.
I loved the humanities as an English professor. But it was only when I became involved in public humanities that I began to understand their value not just for individuals but for communities. That's the public good. And that's why we cannot afford to let a narrow rhetoric of work force preparation push the humanities from our curriculums or defund the work of the National Endowment for the Humanities.
Paula M. Krebs is dean of the College of Humanities and Social Sciences at Bridgewater State University, in Massachusetts, and a member of the board of directors of the Rhode Island Council of the Humanities.
Teaching with PowerPoint has been an exercise in frustration for me. I find that my course preparation takes twice as long as it should, and the results are more often than not unsatisfying. It also makes me feel muffled and absent from the classroom. Maybe this is a function of my poor PowerPoint form, of being a latecomer to a technology that younger faculty use with more ease and panache. In a way, it’s not surprising that I would struggle with it. Although I’m young and pretty tech-savvy at 43, I can’t associate PowerPoint with my lived experiences as a learner. I spent my whole life as a student, from kindergarten through graduate school, plucking words out of the air to put them in my notebook, or following along as my teachers scribbled on the blackboard. The most technology-forward moments involved the occasional projection of transparencies in science classes.
Last semester I decided to conduct an experiment. For years, even before becoming a PowerPoint user, my chalkboard form had suffered from a lack of discipline and focus. What if I really rededicated myself to it? I decided to make writing on the chalkboard my primary method and PowerPoint my secondary tool. The outcome of the exercise was fantastic. I felt like I was waking up from being half-asleep as a teacher.
One of the things I liked the most about the experience was how using the chalkboard freed me to be more responsive to the needs of my students. Although I always came to class with an outline of notes to write on the board, I knew that it was changeable and schematic, subject to revision by student comments and questions. If you compared my paper notes with what actually went on the chalkboard you’d discover all kinds of emendations and additions. The chalkboard encouraged me to be more attentive to classroom conversations, to be more confident about changing my script.
Using the chalkboard also encouraged me to package or process information for my students in more versatile ways. I could come to class and write bullet points on the board as a starting point, then while interacting with my students, proceed to annotate with symbols (asterisks, arrows, underlining). If they still didn’t get it, I could erase and diagram, or erase and do a flow chart. The chalkboard is dynamic, changeable, sensitive, immediate, and completely in the classroom moment. It models note taking and underlines the value of trial and error thinking and brainstorming, skills that are vital to analytical thinking.
I also appreciated the chalkboard because it is an embodied kind of learning. It synchs the bodies of the students to the movement of the body of the instructor. The fact that there is no PowerPoint file to download or pass out, and that the eraser is eventually coming around, means that the class gets in a rhythm of following the movements of the instructor. There is a ritual of collective focus and activity. The instructor has to be much more physically present because writing on the chalkboard requires choreography, gesture and tempo. This is of practical value but there’s also something deeper. In an existence increasingly defined by the virtual, it is important to reassert physical presence.
At the end of class, I sometimes looked at the board before erasing it. So this is what had happened in class in the last hour! I could see the vague outlines of my original plan overlaid with symbols of emphasis and additions that had emerged through classroom conversations. Here it was: the exciting record of a collaborative enterprise between teacher and students. The board recorded an event that could never be repeated in precisely the same way, even if I used the same notes to try to do so.
All of this may seem ridiculous if you teach in a pedagogical ecosystem where chalkboards are still prominent. On my campus, it seems like everyone uses PowerPoint. The situation is so pervasive that I once noticed student pens only went up when the PowerPoint was projected on screen. If I wrote a series of items on the board, not very many students wrote them down. In their minds, PowerPoint was the chalkboard and the chalkboard was just a piece of furniture. All my colleagues, in talking about course preparation, use the word PowerPoint: I was up late preparing my PowerPoints … I left my PowerPoint at home … I couldn’t finish my PowerPoint today in class.
In my circles you can’t use the word "blackboard" as a synonym for chalkboard because everyone will assume you’re referring to our learning management system. This last detail is probably the most symbolically telling: in spite of hundreds of years of use, and its iconic stature as a symbol of the classroom, the word "blackboard" has been hollowed out by a corporation.
The problem with educational technology when it becomes institutionalized and naturalized is that it easily becomes a crutch rather than an instrument to enhance community and interaction between human beings. What is brilliant about José Bowen’s well-known "Teaching Naked" concept is that it affirms technology as a tool for enhancing a humanistic classroom interaction. Interest in PechaKucha and Prezi, screen projection formats and templates that discard the stale formulas of conventional PowerPoint, underscores that instructors and presenters everywhere recognize that we need to allow for creativity and responsiveness in our use of educational technology. We are at our best as teachers when we question the tools we are given and reinvent them. This happens every day in thousands of classrooms when innovative teachers bend PowerPoint to their will, instead of the opposite. The real software behind any instructional technology is the instructor; don’t underestimate her ability to elevate a rudimentary tool or ruin a promising and sophisticated one.
I’m not arguing against PowerPoint tout court. Heck, I plan on continuing to use it as one tool among others. I am just suggesting that the old chalkboard still has something to teach us. If you haven’t tried it recently, you should. It’s the latest thing and you don’t have to plug it into an outlet or find a network to use it.
Christopher Conway is associate professor of modern languages at the University of Texas at Arlington, where he teaches courses in modern Latin American literature and culture.
Newly hired in a tenure-track position, you receive an e-mail from the university provost that reads, "You are appointed to the new Giant Ground Sloth Task Force."
You wonder what a group named for a prehistoric beast might do. Could the task force preserve a carcass found miraculously intact? Might the task force replicate sloth DNA to create a test-tube embryo? Could there be a living giant ground sloth somewhere, plodding along merrily because it doesn’t know it’s extinct?
You dash into the first meeting and see on the conference table a large plate of glazed doughnuts unlike anything you’ve encountered at a faculty gathering. You take a doughnut, glance around, and realize you are the only instructor in the room.
The person in charge announces, "I am your Special Outside Consultant. We’re here to discuss the pros and cons of replacing your university’s traditional mascot, Polly Polyp, with a new creation, Sleepy the Giant Ground Sloth."
You ask, "Why change the mascot?"
"Polyps are immobile blobs," explains another member of the task force, the Associate Director of Sporting Events. "At games, Polly Polyp doesn’t run around or jump up and down, but stands perfectly still."
"The task force must decide whether a more mobile mascot would attract more students," says a third person, the Co-Director of In-State Recruiting.
You take a bite from your glazed doughnut and feel inspired by the glucose rush. "I have an idea for recruiting," you declare. "Our university’s mission statement says that we promote global awareness, doesn’t it?"
You hear furious clicking as everyone calls up the mission statement.
"Yes, it does," exclaims the Chief Adviser to the Associate Chancellor.
You say, "All universities make that claim, but let’s require our undergrads to take two years of one language other than English and one year of another language. We could stipulate at least one of the two must be from outside the Indo-European language family. If we did that, we could advertise that we prepare people to participate in international affairs."
"I don’t feel that we could market that concept," says the Coordinator of Full-Pay Student Recruiting. "Our new campaign is called Fun for You at the U."
You take another bite of the glazed doughnut and ask, "Doesn’t our university’s mission statement claim that we turn students into better citizens?"
Again you hear furious clicking. The Assistant to the Assistant Vice Provost declares, "Indeed it does!"
You say, "If fun is the recruiting theme, how about a required first-year course called Fun With Public Issues in which students enjoy hunting for fallacies in discourse? They could go on to Fun With National Issues, Fun With International Issues and Fun With Special Topics Issues. Each year every level could have a contest to see who could find the most ridiculous statement made by a public official."
"We already have the majority of our classes taught by part-timers," says the Assistant Dean of Intermittently Employed Professionals. "We couldn’t hire a hundred more adjuncts to teach that many sections every semester."
You polish off your treat and feel the courage that only inexperience and a glazed doughnut can bring. You say, "Our mission statement claims we value excellence of instruction, true?"
Once more furious clicking fills the room.
"True," announces the Co-Director of Large Gift Acceptance.
You say, "In the next decade we’re supposed to produce thousands more college graduates than ever before. To do that, the university plans to dump more work on part-timers, true?"
"Perhaps," says the Interim Coordinator of External Public Relations.
You say, "All universities will face this problem, but let’s get ahead of the others. Let’s transform those part-time positions into tenure-track slots."
"Impossible! We don’t have enough offices for that many additional full-time instructors," says the Associate Vice President of Space Allocation.
You try another glazed doughnut and ask a new question. "Why replace Polly, an immobile mascot, with a giant ground sloth named Sleepy?"
"We don’t want to offend alumni who identify with an immobile mascot, so we thought we might introduce one that moves, but only a little bit, and very slowly," says the Assistant to the Full Director of Alumni Satisfaction. "If Sleepy goes over, in 10 or 15 years we’ll try something more active."
Brent Chesley is a professor of English at Aquinas College, in Michigan.
Last year Temple University Press published Toby Miller's Blow Up the Humanities, a book that starts straining for provocation with its title and never lets up. The author is a professor of media and cultural studies at the University of California at Riverside. His preferred rhetorical stance is that of the saucy lad -- pulling the nose of Matthew Arnold and not fooled for a minute by all that “culture as the best which has been thought and said” jazz, man.
What we must recognize, his argument goes, is that there are two forms of the humanities now. What the author calls "Humanities One" (with literature, history, and philosophy at their core) is just the privileged and exclusionary knowledge of old and dying elites, with little value, if any, to today’s heterogeneous, globalized, wired, and thrill-a-minute world. By contrast, we have studies of mass media and communications making up “Humanities Two,” which emerged and thrived in the 20th century outside “fancy schools with a privileged research status.”
In the future we must somehow establish a third mode: “a blend of political economy, textual analysis, ethnography, and environmental studies such that students learn the materiality of how meaning is made, conveyed, and discarded.” Enough with the monuments of unaging intellect! Let the dead bury the dead; henceforth, culture must be biodegradable.
What I chiefly remember about Blow Up the Humanities, a few months after reading it, is exclaiming “What a cheeky monkey you are!” every few pages -- or at least feeling like this was expected of me. Otherwise it mostly seemed like vintage cultural-studies boilerplate. But one passage in the book did strike me as genuinely provocative. It takes the form of a footnote responding to Google’s claim of a "commitment to the digital humanities." Here it is, in full:
“In the United States, ‘the digital humanities’ can mean anything from cliometric analysis to ludic observation. It refers to a method of obtaining funds for conventional forms of Humanities One, dressed up in rather straightforward electronic empiricism. So counting convicts in law reports or references to Australia in Dickens becomes worthy of grant support because it is archival and computable.”
A scrawl in the margin records my immediate response upon reading this: “Cute but misleading.” But now, on second thought… Well, actually “cute but misleading” pretty well covers it. The caricature of the digital humanities might have been recognizable a dozen years ago, though just barely even then. What makes Miller’s polemical blast interesting is the angle of the assault. For once, a complaint about the digital humanities isn’t coming from traditionalist, semi-luddite quarters -- “traditionalist” with regard to the objects of study (i.e., books, manuscripts, paintings) if not necessarily the theories and methods for analyzing them.
On the contrary, Miller regards video games as a rich cultural medium, both profitable and profound. To shore up his claims for Humanities Two (or, fingers crossed, Three) he finds it useful to pretend that the digital humanities will, in effect, take us back to the era of professors tabulating Chaucer’s use of the letter “e.” The scholarship will be more efficient, if no less dull.
Now, I have no interest in impeding the forward march of Angry Birds studies, but there is no way that Miller doesn’t know better. The days when humanities computing was used to count dead convicts are long gone. Much more likely now would be a project in which all of the surviving files of Victorian prisons are not simply rendered searchable but integrated with census data, regional maps, and available documentation of riots, strikes, and economic trends during any given year.
The MLA is a major component of the Humanities One infrastructure, of course, but has enough Humanities Two people in it to suggest that the distinction is anything but airtight. And while Miller pillories the digital humanities as nothing but “a method of obtaining funds for conventional forms of Humanities One,” even old-school philological practice takes on new valences in a digital environment.
“In the humanities,” write Charles Cooney, Glenn Roe, and Mark Olsen in their contribution, “scholars are primarily concerned with the specifics of language and meaning in context, or what is in the works. [Textbases] tend to represent specific linguistic or national traditions, genres, or other characteristics reflecting disciplinary concerns and scholarly expertise.… [T]extbases in the digital humanities are generally retrospective collections built with an emphasis on canonical works in particular print traditions.”
So far, so Humanities One-ish -- with only the neologism “textbase” to show that much has changed since Isaac Casaubon’s heroic proof that the Corpus Hermeticum wasn’t as ancient as everybody thought. Textbase just means “collection,” of course. For that matter, the options available in textbase design (the ways of annotating a text, of making it searchable, of cross-referencing it with other items in the textbase or even in other textbases) are basically high-tech versions of what scholars did four hundred years ago.
Alas, what Casaubon could do alone in his study now requires an interdisciplinary team, plus technicians. But he did not have the distractions we do.
If digital humanists were limited to converting cultural artifacts of the print era into textbases, that would still be useful enough, in its way. The classics aren’t going to annotate themselves. But the warehouse is much larger than that. Beyond the inherited mass of documents from the past 5,000 years, more and more texts are now “born digital.” And beyond warehousing and glossing such material, the digital humanities incorporate the changes in how people receive and engage with cultural material, as Alan Liu discusses in “From Reading to Social Computing,” his essay for the MLA anthology.
What Liu calls “the core circuit of literary activity” – the set of institutions, routines, and people involved in transmitting a poem (or whatever) from the author’s notebook to the reader's eyeballs – has been reconfigured dramatically over the past two decades. Besides making it possible to publish or annotate a text in new ways, the developing communication system transforms the culture itself. The digital humanist has to map, and remap, the very ground beneath our feet.
Nor is that a new development. Other papers in the anthology will give you a sense of how the digital humanities have developed over the long term -- beginning when Roberto Busa started using a computer to prepare an exhaustive concordance of Thomas Aquinas in the 1940s. At some point, the most important change in the digital humanities will be to drop the word "digital."
The faculty in postsecondary education has changed so much in the last 20 years that it has been labeled a "revolution" by researchers who study the professoriate. More than two-thirds of the faculty providing instruction in nonprofit higher education are currently employed off the tenure track, and their numbers continue to rise. This shift alone may be cause for concern, but the real dilemma is that institutions have not developed a new faculty model or employment practices that are based on a realistic conception of the faculty and its composition. The faculty model currently in use has not been achieved through intentional and thoughtful planning. It is the haphazardly derived product of casual, short-term planning and reactive decision making amid constrained budgets; it reflects little thought or concern for its implications for student learning or enlightened employment practice.
Today, many faculty members have no job security or expectation of employment beyond the current term. Many do not receive benefits, and their compensation is extremely low, averaging $2,700 per course, making it difficult to earn a living wage even when they can get consistent work. Sometimes, however, they cannot obtain a full course load. Institutional policies and practices often make them ineligible for unemployment benefits when this occurs. Recent reporting has exposed that some faculty members are living on food stamps. Only 25 percent of non-tenure-track faculty have any form of health insurance, and even those with coverage often find it inadequate.
Even basic forms of institutional support that could improve faculty performance -- and, by extension, enhance their capabilities to promote student learning -- are lacking. As a result of our failure to acknowledge and address the changing faculty, we have made it unnecessarily difficult for a majority of the faculty to do their jobs. Non-tenure-track faculty members -- particularly part-time faculty members -- often do not receive an orientation, professional development or mentoring, and they may even be excluded from faculty meetings. So they may not understand institutional goals, learn about pedagogies for effectively educating the students they teach, or have opportunities to strengthen their skills.
Only a few are involved in curriculum design and governance, even though they may outnumber tenure-track faculty or teach a majority of the credit hours at their institutions. They typically lack office space and may not receive compensation for holding office hours to support their students. Additionally, hiring decisions are routinely made at the last minute, often within days of a class beginning. Making matters worse, institutions do not always provide these faculty members with adequate materials or resources, including a sample syllabus, to help them prepare on such short notice.
This model constrains faculty members’ ability to provide a quality learning environment and make their maximum contribution to educating students. There is now evidence that the poor working conditions we impose upon them have an adverse effect on student retention, transfer, and graduation rates, as well as other indicators of learning and student success. Much of the employment literature addresses the need for employees to be motivated and well-trained, but also to have access to the basic resources, materials, supplies, and conditions that allow them to perform their duties. Adjuncts have been robbed of the opportunity to give their best effort for their students. With this evidence close at hand, and with the moral objections to a model that leaves employees without a living wage or safety net becoming clearer, one would expect more significant outrage, or at least concern, within our academic community.
Adjuncts have been writing about their poor working conditions for years. They have done so with trepidation, as many commentators have demonized them as the root of the problem, rather than recognizing the effects of this poor employment model or the conditions they endure. Yet they continue to lend their voices to the just cause of change.
Why have so few outside these ranks taken up this cause? While non-tenure-track faculty have been vocal in advocating for change, virtually no institutional, foundation, or policy leaders have acknowledged the hard realities of these conditions or expressed concern. In fact, in private, a few postsecondary leaders will note that they feel bad and think the model is morally bankrupt. In public, though, they often show no leadership, nor do they voice their objections to a model that surely cannot be sustained -- nor should it be.
As a result, institutions, foundations, and government pour billions of dollars into initiatives for completion and success, many of which cannot succeed because they fail to account for the faculty responsible for carrying out changes designed to improve the learning environment. Goals for improving access and outcomes are severely undermined. We can blame decreasing funding and external pressures. However, many institutions have had a choice and still shifted money away from instruction to fund other priorities. Others, particularly community colleges, are sometimes so starved of resources that they have had no real options.
This cannot continue. Ours should be an ethical employment model with integrity -- one that allows us to draw upon the strengths of all our faculty to create and sustain a high-quality learning environment to best serve students. Today, we raise these concerns; in a short time, so too will a public dissatisfied with the inaction and inattention of our leaders to these problems. So we invite leaders from across the country to join the Delphi Project on the Changing Faculty and Student Success not only in calling for changes, but in helping to create new solutions to this problem now -- to challenge the status quo and advance a new employment model for higher education that has integrity.
We applaud the leaders that have joined us so far, including the Western Interstate Commission for Higher Education, the Association of American Colleges & Universities, the New Faculty Majority, the American Association of Community Colleges, the American Federation of Teachers, the League for Innovation, the Council for Higher Education Accreditation, the Association of Governing Boards, the National Association of College and University Business Officers, the State Higher Education Executive Officers, various disciplinary societies, and others (listed on our website). We hope you will visit our website and use the resources we have prepared to begin to address and move away from this unethical employment model.
Adrianna Kezar, David Longanecker and Daniel Maxey
Adrianna Kezar is a professor at the University of Southern California and director of the Delphi Project on the Changing Faculty and Student Success.
David Longanecker is president of the Western Interstate Commission for Higher Education.
Daniel Maxey is a doctoral student at the University of Southern California.