America's public research universities face a challenging economic environment characterized by rising operating costs and dwindling state resources. In response, institutions across the country have looked toward the corporate sector for cost-cutting models. The hope is that implementing these “real-world” strategies will centralize redundant tasks (allowing some to be eliminated), stimulate greater efficiency, and ensure long-term fiscal solvency.
Recent events at the University of Michigan suggest that faculty should be proactive in the face of such “corporatization” schemes, which typically are packaged and presented as necessary and consistent with a commitment to continued excellence. The wholesale application of such strategies can upend core academic values of transparency and shared governance, and strike at the heart of workplace equity.
Early this month our university administration rolled out the “Workforce Transition” phase of its “Administrative Services Transformation” (AST) plan. From far on high, with virtually no faculty leadership input, 50 to 100 staff members in the College of Literature, Science, and the Arts (LS&A) departments were informed that their positions in HR and finance (out of an anticipated total of 325) would be eliminated by early 2014. Outside consultants, none of whom actually visited individual departments for any serious length of time, reduced these positions to what they imagined as their “basic” functions: transactional accounting and personnel paperwork.
It became clear that many of those impacted constitute a specific demographic: women, generally over 40 years of age, many of whom have served for multiple decades in low- to mid-level jobs without moving up the ranks. A university previously committed to gender equity placed the burden of job cuts on the backs of loyal and proven female employees.
These laid-off employees found little comfort in learning that they would be free to apply for one of 275 new positions in HR or finance that will be housed at an off-campus “shared services” center disconnected from the intellectually vital campus life.
The resulting plan reveals no awareness of how departments function on an everyday basis. Such “shared services” models start with the presumption that every staff member is interchangeable and every department’s needs are the same. They frame departments as “customers” of centralized services, perpetuating the illusion that the university can and should function like a market. This premise devalues the local knowledge and organic interactions that make our units thrive. Indeed, it dismisses any attribute that cannot be quantitatively measured or “benchmarked.” Faculty members who reject these models quickly become characterized as “change resisters”: backward, tradition-bound, and incapable of comprehending budgetary complexities.
The absence of consultation with regard to the plan is particularly galling given that academic departments previously have worked well with the administration to keep the university in the black. Faculty members are keenly aware of our institution’s fiscal challenges and accordingly have put in place cost-cutting and consolidating measures at the micro level for the greater good.
Worries about departmental discontentment with AST and shared services resulted in increasing secrecy around the planned layoffs. In an unprecedented move, department chairs and administrators were sworn to silence by “gag orders” prohibiting them from discussing the shared services plan even with each other. Perturbed, close to 20 department chairs wrote a joint letter to top university executives expressing their dismay. As one department chair said, “The staff don't know if they can trust the faculty, the faculty don't know if they trust the administration.”
Within a few days, at least five LS&A departments had written collective letters of protest, signed by hundreds of faculty members and graduate students. Over the past few weeks, that chorus of opposition has only intensified as faculty members from all corners of our campus have challenged AST. Some have called for a one- to two-year moratorium and others for an outright suspension of the program.
The outcry against the planned transition itself reflects the growing rift between departmental units and the central administration at the University of Michigan. Championed as an astute financial fix by a cadre hidden away in the upper-level bureaucracy, the shared-services model is the brainchild of Accenture, an outside consulting firm with which our university has also contracted for a multimillion-dollar IT rationalization project.
Caught off-guard by the strong pushback, the administration has issued several messages admitting that their communication strategies around these changes were inadequate, stating that for now layoffs will be avoided, and assuring us that there will be greater consultation and transparency going forward.
While these definitely are hopeful signs, important questions about institutional priorities and accountability have arisen.
Initially, the university’s consultants claimed that AST would yield savings of $17 million. Over time that figure shrank to $5 million, and by some accounts now is reputed to be as low as $2 million. Yet the university has already reportedly spent at least $3 million on this effort, with even more spending on the horizon.
Where are the cost savings? How much more will the university spend on Accenture and other outside consultants? How will replacing or shifting valued employees, even at lower numbers and salaries, from their departmental homes to what essentially is a glorified offsite “call center” actually enhance efficiency? How can a university ostensibly committed to gender equity justify making long-serving and superb female employees pay the price of AST? What credible proof is there that centralized management will provide any budgetary or administrative benefits to the specialized needs of individual departments?
The implications of these questions are thrown into starker relief when considering that almost to the day of the announced layoffs, the university launched its most ambitious capital campaign, “Victors for Michigan,” with festivities costing more than $750,000 and a goal of raising $4 billion.
Whether or not the collective protest initiated by a critical mass of faculty will result in change or reversal remains to be seen. Nevertheless, the past few weeks have been a wake-up call. Faculty must educate themselves about the basic fiscal operations of the institution in these changing times and reassert their leadership. Gardens, after all, require frequent tending.
Otherwise, we remain vulnerable to opportunistic management consultants seeking to use fiscal crisis as a source of profit. Public institutions that remain under the spell of misleading corporate promises will ultimately save little and lose a great deal.
Anthony Mora is associate professor of American culture and history at the University of Michigan. Alexandra Minna Stern is professor of American culture and history, and a professor of obstetrics and gynecology at the University of Michigan.
For some reason I have become aware that it is possible to take photographs of bass guitar players in mid-performance and, by digital means, to replace their instruments with dogs, so that it then appears the musicians (who very often wear facial expressions suggesting rapture or deep concentration) are tickling the dogs. Yes, yes it is.
I am not proud of this knowledge and did not seek it out, and would have forgotten about it almost immediately if not for something else occupying my attention in the past few days: a couple of new books treating the phenomenon with great and methodical seriousness. Not, of course, the dog-tickling bass player phenomenon as such, but rather, the kind of online artifact indicated by the titles of Karine Nahon and Jeff Hemsley’s Going Viral (Polity) and Limor Shifman’s Memes in Digital Culture (MIT Press).
The authors differentiate between the topics of the two volumes. Despite a common tendency to equate them, memes don’t always “go viral.” Things that do (say, video shot during a typhoon, uploaded while the disaster is still under way) are not always memes. The distinction will be clarified shortly -- and there is indeed some value in defining the contrast. It corresponds to different kinds of behavior or, if you prefer, different ways of mediating social and cultural life by means of our all-but-inescapable digital devices.
Still, the line can be drawn only just so sharply. It seems bright and clear when the authors bring their different methods (one book more quantitative than qualitative, the other the reverse) to the job. I don’t mean that the difference between viral and memetic communication is simply one of perspective. It seems to exist in real life. But so does their tendency to blur.
“Virality,” write Nahon and Hemsley in a definition unlikely to be improved upon, “is a social information flow process where many people simultaneously forward a specific information item, over a short period of time, within their social networks, and where the message spreads beyond their own (social) networks to different, often distant networks, resulting in a sharp acceleration in the number of people who are exposed to the message.” (Nahon is an associate professor, and Hemsley a Ph.D. candidate, at the Information School of the University of Washington.)
Here the term “information item” is used very broadly, to cover just about any packet of bytes: texts, photographs, video, sound files, etc. It also includes links taking you to such material. But unlike a computer virus -- an unwanted, often destructive such packet -- a message that has “gone viral” doesn’t just forward itself. It propagates through numerous, dispersed, and repeated decisions to pay attention to something and then circulate it.
The process has a shape. Charting on a graph the number of times a message is forwarded over time, we find that the curve for a news item appearing at a site with a great deal of traffic (or a movie trailer advertised on a number of sites) shoots up at high speed, then falls just about as rapidly. The arc is rapid and smooth.
By contrast, the curve for an item going viral is a bit more drawn-out -- and a lot rougher. It may show little or no motion for a while before starting to trend upward (possibly followed by a plateau or downturn or two) until reaching a certain point at which the acceleration becomes extremely sharp, heading to a peak, whereupon the number of forwards begins to fall off, more or less rapidly -- with an occasional bounce upward perhaps, but nothing so dramatic as before.
So the prominently featured news item or blockbuster ad campaign on YouTube shoots straight up, like a model rocket on a windless day, until the fuel (newsworthiness, dollars) runs out, whereupon it stops, then begins to accelerate in the opposite direction. But when something goes viral, more vectors are involved. It circulates within and between clusters of people -- individuals with strong mutual connections with each other. It circulates through the networks, formal or informal, in which those clusters are embedded.
And from there, onward and outward -- whether with a push (when somebody with a million Twitter followers takes notice), or a pull (it begins to rank among top search-engine results on a certain topic), or both. The authors itemize factors in play in decisions about whether or not to share something: salience, emotional response, congruence with the person’s values, etc. And their definition of virality as “a social information flow process” takes into account both the horizontal dimension of exchange (material circulating spontaneously among people familiar with one another) and the roles of filtering and broadcasting exercised by individuals and online venues with a lot of social capital.
None of which makes virality something that can be planned, however. “Content that we create can remain stubbornly obscure even when we apply our best efforts to promote it,” they write. “It can also grow and spread with an apparent life and momentum of its own, destroying some people’s lives and bringing fame and fortune to others, sometimes in a matter of days.”
An Internet meme, as Limor Shifman sums things up, is “(a) a group of digital items sharing common characteristics of content, form, and/or stance; (b) that were created with awareness of each other; and (c) were circulated, imitated, and/or transformed via the Internet by many users.”
As with virality, the concept rests on a biological metaphor. Coined by Richard Dawkins in 1976, “meme” began in a quasi-scientific effort to identify the gene-like elements of behavior, cultural patterns, and belief systems that caused them to persist, expand, and reproduce themselves over very long periods of time. As reincarnated within cyberculture, the meme is a thing of slighter consequence: a matter of endless variation on extremely tenacious inside jokes, occupying and replicating within the brains of bored people in offices.
Shifman's point that memetic communication (which for the most part involves mimicry of existing digital artifacts with parodic intent and/or "remixing" them with new content) is an exemplary case of Web 2.0 culture seems to me sound, which probably also explains why much in the book may seem familiar even to someone not up on LOLcats studies. Yes, memes are a form of active participation in digital communication. Yes, they can carry content that (whether the meme goes viral or not) questions or challenges existing power structures. I have seen my share of Downfall parody videos, and am glad to know that Bruno Ganz is okay with the whole thing. But every so often that line from Thoreau comes to mind -- "as if we could kill time without injuring eternity" -- and it seems like a good idea to go off the grid for a while.
Over the last year there has been a steady stream of articles about the “crisis in the humanities,” fostering a sense that students are stampeding from liberal education toward more vocationally oriented studies. In fact, the decline in humanities enrollments, as some have pointed out, is wildly overstated, and much of that decline occurred in the 1970s and 1980s. Still, the press is filled with tales about parents riding herd on their offspring lest they be attracted to literature or history rather than to courses that teach them to develop new apps for the next, smarter phone.
America has long been ambivalent about learning for its own sake, at times investing heavily in free inquiry and lifelong learning, and at other times worrying that we need more specialized training to be economically competitive. A century ago these worries were intense, and then, as now, pundits talked about a flight from the humanities toward the hard sciences.
Liberal education was a core American value in the first half of the 20th century, but a value under enormous pressure from demographic expansion and the development of more consistent public schooling. The increase in the population considering postsecondary education was dramatic. In 1910 only 9 percent of students received a high school diploma; by 1940 it was 50 percent. For the great majority of those who went on to college, that education would be primarily vocational, whether in agriculture, business, or the mechanical arts. But even vocationally oriented programs usually included a liberal curriculum -- a curriculum that would provide an educational base on which one could continue to learn -- rather than just skills for the next job. Still, there were some then (as now) who worried that the lower classes were getting “too much education.”
Within the academy, between the World Wars, the sciences assumed greater and greater importance. Discoveries in physics, chemistry, and biology did not seem to depend on the moral, political, or cultural education of the researchers -- specialization seemed to trump broad humanistic learning. These discoveries had a powerful impact on industry, the military, and health care; they created jobs! Specialized scientific research at universities produced tangible results, and its methodologies -- especially rigorous experimentation -- could be exported to transform private industry and the public sphere. Science was seen to be racing into the future, and some questioned whether the traditional ideas of liberal learning were merely archaic vestiges of a mode of education that should be left behind.
In reaction to this ascendancy of the sciences, many literature departments reimagined themselves as realms of value and heightened subjectivity, as opposed to so-called value-free, objective work. These “new humanists” of the 1920s portrayed the study of literature as an antidote to the spiritual vacuum left by hyperspecialization. They saw the study of literature as leading to a greater appreciation of cultural significance and a personal search for meaning, and these notions quickly spilled over into other areas of humanistic study. Historians and philosophers emphasized the synthetic dimensions of their endeavors, pointing out how they were able to bring ideas and facts together to help students create meaning. And arts instruction was reimagined as part of the development of a student’s ability to explore great works that expressed the highest values of a civilization. Artists were brought to campuses to inspire students rather than to teach them the nuances of their craft. During this interwar period a liberal education surely included the sciences, but many educators insisted that it not be reduced to them. The critical development of values and meaning was a core function of education.
Thus, despite the pressures of social change and of the compelling results of specialized scientific research, there remained strong support for the notion that liberal education and learning for its own sake were essential for an educated citizenry. And rather than restrict a nonvocational education to established elites, many saw this broad teaching as a vehicle for ensuring commonality in a country of immigrants. Free inquiry would model basic democratic values, and young people would be socialized to American civil society by learning to think for themselves.
By the 1930s, an era in which ideological indoctrination and fanaticism were recognized as antithetical to American civil society, liberal education was acclaimed as key to the development of free citizens. Totalitarian regimes embraced technological development, but they could not tolerate the free discussion that led to a critical appraisal of civic values. Here is the president of Harvard, James Bryant Conant, speaking to undergraduates just two years after Hitler had come to power in Germany:
To my mind, one of the most important aspects of a college education is that it provides a vigorous stimulus to independent thinking.... The desire to know more about the different sides of a question, a craving to understand something of the opinions of other peoples and other times mark the educated man. Education should not put the mind in a straitjacket of conventional formulas but should provide it with the nourishment on which it may unceasingly expand and grow. Think for yourselves! Absorb knowledge wherever possible and listen to the opinions of those more experienced than yourself, but don’t let any one do your thinking for you.
This was the 1930s version of liberal learning, and in it you can hear echoes of Thomas Jefferson’s idea of autonomy and Ralph Waldo Emerson’s thoughts on self-reliance.
In the interwar period the emphasis on science did not, in fact, lead to a rejection of broad humanistic education. Science was a facet of this education. Today, we must not let our embrace of STEM fields undermine our well-founded faith in the capacity of the humanities to help us resist “the straitjackets of conventional formulas.” Our independence, our freedom, has depended on not letting anyone else do our thinking for us. And that has demanded learning for its own sake; it has demanded a liberal education. It still does.
Michael Roth is president of Wesleyan University. His new book, Beyond the University: Why Liberal Education Matters, will be published next year by Yale University Press. His Twitter handle is @mroth78.
The liberal arts are dead, or — at best — dying. That's the theme of story after story in today’s news media.
Professional skills training is in. The STEM (science, technology, engineering, and math) fields are in. Practical, vocational higher education is in. The liberal arts are out, relics of a “traditional” way of thinking that has been overtaken by the pressing demands of our dizzyingly complex digital age.
As new students arrived on college campuses this fall, the message many of them heard was that majoring in history, or English, or anthropology is a surefire recipe for a life of irrelevance and poor job prospects. These “conventional” disciplines cannot possibly train students for productive, enriching careers in the high-tech information age whose future is now.
Although this viewpoint is rapidly gaining the status of settled wisdom, it is tragically misguided. It is based on a false dichotomy, namely that the liberal arts and the more vocational, preprofessional, practical disciplines — like, say, computer science — are fundamentally different and opposed. But this misunderstands both the age we’re living in and the challenges we face, not to mention one of the most significant trends in higher education over the last few decades — the evolution of interdisciplinarity.
In essence, this whole debate comes down to skills. The liberal arts are often said by critics to provide little that is of “practical value” in the “real world.” In reality, though, liberal arts curriculums can and do give students skills that are just as professionally useful as those in more “relevant” occupationally specific fields of study.
At my university, the University of Maryland-Baltimore County, students this fall can declare a new major called global studies, which integrates courses in 12 liberal arts departments — including economics, geography and environmental systems, history, media and communication studies, and political science — into a rigorous interdisciplinary curriculum. Majors are required to study abroad and to achieve fluency in at least one foreign language. By graduation, they will have demonstrated their research, analytical, critical-thinking, and writing skills in a substantial, “capstone” research project. Our students will also do internships with companies, not-for-profits, and government agencies.
Equally important, they will develop “global competence,” which employers in many professions have identified as one of the most desirable, but grossly lacking, sets of skills required of their new employees. Broadly defined, global competence is “the capacity and disposition to understand and act on issues of global significance.” Its central elements include knowledge of world affairs — cultural, economic, and political; proficiency in communicating with people in and from other societies, both verbally and in writing; the ability to appreciate multiple perspectives and respect cultural diversity; and the intellectual and psychological flexibility to adapt to unfamiliar and rapidly changing circumstances.
Developing the skills that we hope to instill in UMBC’s global studies majors is an inherently interdisciplinary mission. In a recent New York Times column, Yale professor Nicholas Christakis argues that the social sciences (a subset of the liberal arts) badly trail the natural sciences in generating innovative “institutional structures” that can produce the kind of cutting-edge science necessary for solving some of the world’s most intractable -- often intrinsically interdisciplinary -- problems. However, he also notes that this is beginning to change, for example, in the form of a new global affairs major at Yale.
Whether it’s global studies at UMBC or global affairs at Yale, these exciting new programs tangibly articulate why talking about liberal arts education versus practical training creates the false perception that these two enterprises are essentially at odds. At UMBC, it's the combination of interdisciplinary liberal arts education; substantial research, writing and analysis; rigorous foreign language training; study abroad; and experiential learning in the form of internships and other applied opportunities that will give students the skills they will need to thrive and “do good” in the 21st century.
The tragedy is that we might blow it. If we continue to present students with a false choice between the liberal arts and “real-world” vocational training, we will produce what social scientists like to call “suboptimal” outcomes. Too many talented, energetic, hard-working students will choose “safe” educational and career paths, and too many truly global problems will go unsolved.
Devin T. Hagerty is a professor of political science and director of global studies at the University of Maryland-Baltimore County.