History

Last Bastion of Liberal Education?

Why do narratives of decline have such perennial appeal in the liberal arts, especially in the humanities? Why is it that, year after year, meeting after meeting, we hear laments about the good old days and predictions of ever worse days to come? Why is such talk especially common in elite institutions where, by many indicators, liberal education is doing quite well, thank you very much? I think I know why. The opportunity is just too ripe for the prophets of doom and gloom to pass up.

There is a certain warmth and comfort in being inside the “last bastion of the liberal arts,” as B.A. Scott characterized prestigious colleges and research universities in his essay collection The Liberal Arts in a Time of Crisis (New York: Praeger, 1990). The weather outside may be frightful, but inside the elite institutions, if not “delightful,” it’s perfectly tolerable, and likely to remain so until retirement time.

Narratives of decline have also shaped philanthropy, though in a negative way. As Tyler Cowen recently noted in The New York Times, “many donors … wish to be a part of large and successful organizations -- the ‘winning team’ so to speak.” They are not eager to pour out their funds to fill a moat or build a wall protecting some isolated “last bastion.” Narratives of decline thus provide a powerful reason not to reach for the checkbook. Most of us in the foundation world, like most other people, prefer to back winners rather than losers. Since there are plenty of potential winners out there, in areas of pressing need, foundation dollars have tended to flow away from higher education in general, and from liberal education in particular.

But at the campus level there’s another reason for the appeal of the narrative of decline, a genuinely insidious one. If something goes wrong the narrative of decline of the liberal arts always provides an excuse. If course enrollments decline, well, it’s just part of the trend.  If students don’t like the course, well, the younger generation just doesn’t appreciate such material. If the department loses majors, again, how can it hope to swim upstream when the cultural currents are so strong?  Believe in a narrative of decline and you’re home free; you never have to take responsibility, individual or collective, for anything having to do with liberal education.  

There’s just one problem. The narrative of decline is about one generation out of date and applies now only in very limited circumstances. It’s true that in 1890, degrees in the liberal arts and sciences accounted for about 75 percent of all bachelor’s degrees awarded; today the number is about 39 percent, as Patricia J. Gumport and  John D. Jennings noted in “Toward the Development of Liberal Arts Indicators” (American Academy of Arts and Sciences, 2005). But most of that decline had taken place by 1956, when the liberal arts and sciences had 40 percent of the degrees. 

Since then the numbers have gone up and down, rising to 50 percent by 1970, falling to 33 percent by 1990, and then rising close to the 1956 levels by 2001, the last year for which the data have been analyzed. Anecdotal evidence, and some statistics, suggest that the numbers continue to rise, especially in  Research I universities.  

For example, in the same AAA&S report (“Tracking Changes in the Humanities”) from which these figures are derived, Donald Summer examines the University of Washington (“Prospects for the Humanities as Public Research Universities Privatize their Finances”) and finds that majors in the humanities have been increasing over the last few years and that course demand is strong.

The stability of liberal education over the past half century seems to me an amazing story, far more compelling than a narrative of decline, especially when one recognizes the astonishing changes that have taken place over that time: the vast increase in the number of students enrolled in colleges and universities, major demographic changes, the establishment of new institutions, the proliferation of knowledge, the emergence of important new disciplines, often in the applied sciences and engineering, and, especially in recent years, the financial pressures that have pushed many institutions into offering majors designed to prepare students for entry-level jobs in parks and recreation, criminal justice, and now homeland security studies. And underlying many of these changes are transformations of the American economy.

The Other, Untold Story

How, given all these changes, and many others too, have the traditional disciplines of the arts and sciences done as well as they have? That would be an interesting chapter in the history of American higher education. More pressing, however, is the consideration of one important consequence of narratives of decline of the liberal arts.

This is the “last bastion” mentality, signs of which are constantly in evidence when liberal education is under discussion. If liberal education can survive only within the protective walls of elite institutions, it doesn’t really make sense to worry about other places. Graduate programs, then, will send the message that success means teaching at a well-heeled college or university, without any hint that with some creativity and determination liberal education can flourish in less prestigious places, and that teaching there can be as satisfying as it is demanding.

Here’s one example of what I mean. In 2000, as part of a larger initiative to strengthen undergraduate liberal education, Grand Valley State University, a growing regional public institution in western Michigan, decided to establish a classics department. Through committed teaching and imaginative curriculum design, and with strong support from the administration, the department has grown to six tenured and tenure-track positions with about 50 majors on the books at any given moment. Most of these are first-generation college students from blue-collar backgrounds who had no intention of majoring in classics when they arrived at Grand Valley State, but many have an interest in mythology or in ancient history that has filtered down through popular culture and high school curricula. The department taps into this interest through entry-level service courses, which are taught by regular faculty members, not part-timers or graduate students.

That’s a very American story, but the story of liberal education is increasingly a global one as well. New liberal arts colleges and universities are springing up in many countries, especially those of the former Soviet Union.

I don’t mean that the spread of liberal education comes easily, in the United States or elsewhere. It’s swimming upstream. Cultural values, economic anxieties, and all too often institutional practices (staffing levels, salaries, leave policies and research facilities) all exert their downward pressure. It takes determination and devotion to press ahead. And those who do rarely get the recognition or credit they deserve.

But breaking out of the protective bastion of the elite institutions is vital for the continued flourishing of liberal education. One doesn’t have to read a lot of military history to know what happens to last bastions. They get surrounded; they eventually capitulate, often because those inside the walls squabble among themselves rather than devising an effective breakout strategy. We can see that squabbling at work every time humanists treat with contempt the quantitative methods of their scientific colleagues and when scientists contend that the reason we are producing so few scientists is that too many students are majoring in other fields of the liberal arts.  

The last bastion mentality discourages breakout strategies. Even talking to colleagues in business or environmental studies can be seen as collaborating with the enemy rather than as a step toward broadening and enriching the education of students majoring in these fields. The last bastion mentality, like the widespread narratives of decline, injects the insidious language of purity into our thinking about student learning, hinting that any move  beyond the cordon sanitaire is somehow foul or polluting and likely to result in the corruption of high academic standards.   

All right, what if one takes this professed concern for high standards seriously? What standards, exactly, do we really care about and wish to see maintained? If it’s a high level of student engagement and learning, then let’s say so, and be forthright in the claim that liberal education is reaching that standard, or at least can reach that standard if given half a chance. That entails, of course, backing up the claim with some systematic form of assessment.

That provides one way to break out of the last bastion mentality. One reason that liberal education remains so vital  is that when properly presented it contributes so much to personal and cognitive growth. The subject matter of the liberal arts and sciences provides some of the best ways of helping students achieve goals such as analytical thinking, clarity of written and oral expression,  problem solving, and alertness to moral complexity, unexpected consequences and cultural difference. These goals command wide assent outside academia, not least among employers concerned about the quality of their work forces. They are, moreover, readily attainable  through liberal education provided proper attention is paid to “transference.”  “High standards” in liberal education require progress toward these cognitive capacities.

Is it not time, then, for those concerned with the vitality of liberal education to abandon the defensive strategies that derive from the last bastion mentality, and adopt a new and much more forthright stance? Liberal education cares about high standards of student engagement and learning, and it cares about them for all students regardless of their social status or the institution in which they are enrolled.

There is, of course, a corollary. Liberal education can’t just make the claim that it is committed to such standards, still less insist that others demonstrate their effectiveness in reaching them, unless those of us in the various fields of the arts and sciences are willing to put ourselves on the line. In today’s climate  we have to be prepared to back up the claim that we are meeting those standards. Ways to make such assessments are now at hand, still incomplete and imperfect, but good enough to provide an opportunity for the liberal arts and sciences to show what they can do.

That story, I am convinced, is far more compelling than any narrative of decline.

W. Robert Connor is president of the Teagle Foundation and blogs frequently about liberal education.

Digital Masonry

Jacques-Alain Miller has delivered unto us his thoughts on Google. In case the name does not signify, Jacques-Alain Miller is the son-in-law of the late Jacques Lacan and the editor of his posthumously published works. He is not a Google enthusiast. The search engine follows “a totalitarian maxim,” he says. It is the new Big Brother. “It puts everything in its place,” Miller declares, “turning you into the sum of your clicks until the end of time.”

Powerful, then. And yet – hélas! – Google is also “stupid.” It can “scan all the books, plunder all the archives [of] cinema, television, the press, and beyond,” thereby subjecting the universe to “an omniscient gaze, traversing the world, lusting after every little last piece of information about everyone.” But it “is able to codify, but not to decode. It is the word in its brute materiality that it records.” (Read the whole thing here. And for another French complaint about Google, see this earlier column.)

When Miller pontificates, it is, verily, as a pontiff. Besides controlling the enigmatic theorist’s literary estate, Miller has inherited Lacan’s mantle as leader of one international current in psychoanalysis. His influence spans several continents. Within the Lacanian movement, he is, so to speak, the analyst of the analysts’ analysts.

He was once also a student of Louis Althusser, whose seminar in Paris during the early 1960s taught apprentice Marxist philosophers not so much to analyze concepts as to “produce” them. Miller was the central figure in a moment of high drama during the era of high structuralism: in the middle of Althusser’s seminar, he complained that he had been busy producing something he called “metonymic causality” when another student stole it. He wanted his concept returned. (However this conflict was resolved, the real winner had to be any bemused bystander.)

Miller is, then, the past master of a certain mode of intellectual authority – one that has been deeply shaped by (and is ultimately inseparable from) tightly restricted fields of communication and exchange.

Someone once compared the Lacanian movement to a Masonic lodge. There were unpublished texts by the founder that remained more than usually esoteric: they were available in typescript editions of just a few copies, and then only to high-grade initiates.

It is hard to imagine a greater contrast to that digital flatland of relatively porous discursive borders about which Miller complains now. As well he might. (Resorting to Orwellian overkill is, in this context, probably a symptom of anxiety. There are plenty of reasons to worry and complain about Google, of course. But when you picture a cursor clicking a human face forever, it lacks something in the totalitarian-terror department.)

Yet closer examination of Miller’s pronouncement suggests another possibility. It isn’t just a document in which hierarchical intellectual authority comes to terms with the Web's numbskulled leveling. For the way Miller writes about the experience of using Google is quite revealing -- though not about the search engine itself.

“Our query is without syntax,” declares Miller, “minimal to the extreme; one click ... and bingo! It is a cascade -- the stark white of the query page is suddenly covered in words. The void flips into plenitude, concision to verbosity.... Finding the result that makes sense for you is therefore like looking for a needle in a haystack. Google would be intelligent if it could compute significations. But it can’t.”

In other words, Jacques-Alain Miller has no clue that algorithms determine the sequence of hits you get back from a search. (However intelligent Google might or might not be, the people behind it are quite clearly trying to “compute significations.”) He doesn’t grasp that you can shape a query – give it a syntax – to narrow its focus and heighten its precision. Miller’s complaints are a slightly more sophisticated version of someone typing “Whatever happened to Uncle Fred?” into Google and then feeling bewildered that the printout does not provide an answer.
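
To make the contrast concrete, here is a minimal sketch of the difference between the syntax-free query Miller describes and one given a syntax. The operators (quotation marks for an exact phrase, site: to restrict the domain, the minus sign to exclude a term) are standard Google syntax; the names and domain below are invented for illustration.

```python
from urllib.parse import quote_plus

# The kind of query Miller describes: no syntax, minimal to the extreme.
naive = "whatever happened to uncle fred"

# The same question given a syntax: an exact phrase, a domain restriction,
# and an excluded term. (Hypothetical name and domain, for illustration.)
shaped = '"Fred Johnson" site:archives.example.org -obituary'

# Either string can be dropped into a search URL.
print("https://www.google.com/search?q=" + quote_plus(shaped))
```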

For an informed contrast to Jacques-Alain Miller’s befuddled indignation, you might turn to Digital History Hacks, a very smart and rewarding blog maintained by William J. Turkel, an assistant professor of history at the University of Western Ontario. (As it happens, I first read about Miller in Psychoanalytic Politics: Jacques Lacan and Freud's French Revolution by one Sherry Turkle. The coincidence is marred by a slip of the signifier: they spell their names differently.)

The mandarin complaint about the new digital order is that it lacks history and substance, existing in a chaotic eternal present – one with no memory and precious little attention span. But a bibliographical guide that Turkel posted in January demonstrates that there is now an extensive enough literature to speak of a field of digital history.

The term has a nice ambiguity to it – one that is worth thinking about. On the one hand, it can refer to the ways historians may use new media to do things they’ve always done – prepare archives, publish historiography, and so on. Daniel J. Cohen and Roy Rosenzweig’s Digital History: A Guide to Gathering, Preserving, and Presenting the Past on the Web (University of Pennsylvania Press, 2006) is the one handbook that ought to be known to scholars even outside the field of history itself. The full text of it is available for free online from the Center for History and New Media at George Mason University, which also hosts a useful selection of essays on digital history.

But as some of the material gathered there shows, digitalization itself creates opportunities for new kinds of history – and new problems, especially when documents exist in formats that have fallen out of use.

Furthermore, as various forms of information technology become more and more pervasive, it makes sense to begin thinking of another kind of digital history: the history of digitality.

Impressed by the bibliography that Turkel had prepared – and by the point that it now represented a body of work one would need to master in order to do graduate-level work in digital history – I contacted him by e-mail to get more of his thoughts on the field.

“Digital history begins,” he says, “with traditional historical sources represented in digital form on a computer, and with 'born-digital' sources like e-mail, text messages, computer code, video games and digital video. Once you have the proper equipment, these digital sources can be duplicated, stored, accessed, manipulated and transmitted at almost no cost. A box of archival documents can be stored in only one location, has to be consulted in person, can be used by only a few people at a time, and suffers wear as it is used. It is relatively vulnerable to various kinds of disaster. Digital copies of those documents, once created, aren't subject to any of those limitations. For some purposes you really need the originals (e.g., a chemical analysis of ink or paper). For many or most other purposes, you can use digital representations instead. And note that once the chemical analysis is completed, it too becomes a digital representation.”

But that’s just the initial phase, or foundation level, of digital history – the scanning substratum, in effect, in which documents become more readily available. A much more complex set of questions comes up as historians face the deeper changes in their work made possible by a wholly different sort of archival space – what Roy Rosenzweig calls the “culture of abundance” created by digitality.

“He asks us to consider what it would mean to try and write history with an essentially complete archival record,” Turkel told me. “I think that his question is quite deep because up until now we haven't really emphasized the degree to which our discipline has been shaped by information costs. It costs something (in terms of time, money, resources) to learn a language, read a book, visit an archive, take some notes, track down confirming evidence, etc. Not surprisingly, historians have tended to frame projects so that they could actually be completed in a reasonable amount of time, using the availability and accessibility of sources to set limits.”

Reducing information costs in turn changes the whole economy of research – especially during the first phase, when one is framing questions and trying to figure out if they are worth pursuing.

“If you're writing about a relatively famous person,” as Turkel put it, “other historians will expect you to be familiar with what that person wrote, and probably with their correspondence. Obviously, you should also know some of the secondary literature. But if you have access to a complete archival record, you can learn things that might have been almost impossible to discover before. How did your famous person figure in people's dreams, for example? People sometimes write about their dreams in diaries and letters, or even keep dream journals. But say you wanted to know how Darwin figured in the dreams of African people in the late 19th century. You couldn't read one diary at a time, hoping someone had had a dream about him and written it down. With a complete digital archive, you could easily do a keyword search for something like "Darwin NEAR dream" and then filter your results.”
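
What such a proximity search might look like in practice is easy to sketch, assuming the diaries have already been digitized to plain text. Everything below (the near() helper, the document IDs, the diary snippets) is invented for illustration:

```python
import re

def near(text, a, b, window=10):
    """True if a word starting with `a` appears within `window` words of a
    word starting with `b`. Prefix matching is a crude stand-in for stemming,
    so 'dream' also matches 'dreamt' and 'dreams'."""
    words = re.findall(r"\w+", text.lower())
    pos_a = [i for i, w in enumerate(words) if w.startswith(a)]
    pos_b = [i for i, w in enumerate(words) if w.startswith(b)]
    return any(abs(i - j) <= window for i in pos_a for j in pos_b)

# A stand-in for the complete digital archive: document IDs mapped to text.
corpus = {
    "diary_1893_p12": "Last night I dreamt that Mr. Darwin walked among us.",
    "diary_1897_p03": "Market day. Sold two goats. Rain expected.",
}

matches = [doc for doc, text in corpus.items() if near(text, "darwin", "dream")]
print(matches)  # -> ['diary_1893_p12']
```
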
As it happens, I conducted this interview a few weeks before coming across Jacques-Alain Miller’s comments on Google. It seems like synchronicity that Turkel would mention the possibility of digital historians getting involved in the interpretation of dreams (normally a psychoanalyst’s preserve). But for now, it sounds as if most historians are only slightly more savvy about digitality than the Lacanian Freemasons.

“All professional historians have a very clear idea about how to make use of archival and library sources,” Turkel says, “and many work with material culture, too. But I think far fewer have much sense of how search engines work or how to construct queries. Few are familiar with the range of online sources and tools. Very few are able to do things like write scrapers, parsers or spiders.”

(It pays to increase your word power. For a quick look at scraping and parsing, start here. For the role of spiders on the Web, have a look at this.)
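
For readers who have never seen one, here is a bare-bones scraper using nothing but Python’s standard library. The URL is a placeholder, and a real project would add error handling and politeness (rate limits, robots.txt):

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkScraper(HTMLParser):
    """Collect every hyperlink on a page. A parser would go on to pull
    structured data out of the text; a spider would feed these links
    back into a queue and repeat."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Placeholder URL; point this at a real archive's index page.
html = urlopen("https://example.org/archive/index.html").read().decode("utf-8", "replace")
scraper = LinkScraper()
scraper.feed(html)
print(scraper.links)
```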

“I believe that these kinds of techniques will be increasingly important,” says Turkel, “and someday will be taken for granted. I guess I would consider digital history to have arrived as a field when most departments have at least one person who can (and does) offer a course in the subject. Right now, many departments are lucky to have someone who knows how to digitize paper sources or put up web pages.”

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

The Forgotten Virtue of Gratitude

It was a typical 1970s weekday evening. The sky was growing dark and I, an elementary school student, was sitting at the kitchen table of a modest North Jersey Cape Cod putting the finishing touches on the day’s homework. The back door opened -- a telltale sign that my father was home from work. As he did every day, Dad stopped in the laundry room to take off his muddied work boots. As usual, he was tired. He could have been covered with any number of substances, from dirt to paint to dried spackle. His hands were rough and gnarled. I kissed him hello, he went to the bathroom to “wash up,” and my family sat down to eat dinner.

I always knew how hard my father worked each day in his job as a general contractor. When I got older I spent summers working with him. I learned the virtues of this kind of working class life, but I also experienced the drudgery that came with laying concrete footings or loading a dumpster with refuse. I worked enough with my father to know that I did not want to do this for the rest of my life. Though he never told me so, I am sure that Dad probably didn't want that for me, either.

I eventually became only the second person in my extended family to receive a college degree. I went on to earn a Ph.D. (a “post-hole digger” to my relatives) in history and settled into an academic life. As I enter my post-tenure years, I am grateful for what I learned from my upbringing and for the academic vocation I now pursue. My gratitude inevitably stems from my life story. The lives that my parents and brothers (one is a general contractor and the other is a plumber) lead are daily reminders of my roots.

It is not easy being a college professor from a working-class family. Over the years I have had to explain the geographic mobility that comes with an academic life. I have had to invent creative ways to make my research understandable to aunts and uncles. My parents read my scholarly articles, but rarely finish them. My father is amazed that some semesters I go into the office only three days a week. As I write this I am coming off of my first sabbatical from teaching. My family never quite fathomed what I possibly did with so much time off. (My father made sense of it all by offering to help me remodel my home office, for which I am thankful!) “You have the life,” my brother tells me. How can I disagree with him?

Gratitude is a virtue that is hard to find in the modern academy, even at Thanksgiving time. In my field of American history, Thanksgiving provides an opportunity to set the record straight, usually in op-ed pieces, about what really happened in autumn 1621. (I know because I have done it myself!) Granted, as public intellectuals we do have a responsibility to debunk the popular myths that often pass for history, but I wonder why we can’t also use the holiday, as contrived and invented and nostalgic and misunderstood as it is, to stop and be grateful for the academic lives we get to lead.

Thanksgiving is as good a time as any to do this. We get a Thursday off from work to take a few moments to reflect on our lives. And since so many academics despise the shopping orgy known as “Black Friday,” the day following Thanksgiving presents a wonderful opportunity not only to reject consumer self-gratification, but also to practice a virtue that requires us to forget ourselves.

I am not sure why we are such an unthankful bunch. When we stop and think about it, we enjoy a very good life. I can reference the usual perks of the job -- summer vacation, the freedom to make one’s own schedule, a relatively light teaching load (even those with the dreaded 4-4 load are in the classroom less than the typical high school teacher). Though we complain about students, we often fail to remember that our teaching, when we do it well, makes a contribution to society that usually extends far beyond the dozens of people who have read our recent monograph. And speaking of scholarship, academics get paid to spend a good portion of their time devoted to the world of ideas. No gnarled hands here.

Inside Higher Ed recently reported that 78 percent of all American professors express “overall job satisfaction.” Yet we remain cranky. As Immanuel Kant put it, “ingratitude is the essence of vileness.” I cannot tell you how many times I have wandered into a colleague’s office to whine about all the work my college expects of me.

Most college and university professors live in a constant state of discontentment, looking for the fast track to a better job and making excuses as to why they have not landed one yet. Academia can be a cutthroat and shallow place to spend one’s life. We are too often judged by what is written on our conference name badges. We say things about people behind their backs that we would never say to their faces. We become masters of self-promotion. To exhibit gratefulness in this kind of a world is countercultural.

The practice of gratitude may not change our professional guilds, but it will certainly relieve us of our narcissism long enough to realize that all of us are dependent people. Our scholarship rests upon the work of those scholars that we hope to expand upon or dismantle. Our careers are made by the generosity of article and book referees, grant reviewers, search committees, and tenure committees. We can all name teachers and mentors who took the time to encourage us, offer advice, and write us letters. Gratitude may even do wonders for our mental health. Studies have shown that grateful people are usually less stressed, anxious, and depressed.

This Thanksgiving take some time to express gratitude. In a recent study the Harvard University sociologist Neil Gross concluded that more college and university professors believe in God than most academics ever realized. If this is true, then for some of us gratitude might come in the form of a prayer. For others it may be a handwritten note of appreciation to a senior scholar whom we normally contact only when we need a letter of recommendation. Or, as the semester closes, it might be a kind word to a student whose academic performance and earnest pursuit of the subject at hand has enriched our classroom or our intellectual life. Or perhaps a word of thanks to the secretary or assistant who makes our academic life a whole lot easier.

As the German theologian and Christian martyr Dietrich Bonhoeffer explained, “gratitude changes the pangs of memory into a tranquil joy.”

John Fea teaches American history at Messiah College, in Grantham, Pa. He is the author of The Way of Improvement Leads Home: Philip Vickers Fithian and the Rural Enlightenment in America (University of Pennsylvania Press, 2008).

Liberal Arts II: The Economy Requires Them

Many of us committed to the liberal arts have been defensive for as long as we can remember.

We have all cringed when we have heard a version of the following joke: The graduate with a science degree asks, “Why does it work?”; the graduate with an engineering degree asks, “How does it work?”; the graduate with a liberal arts degree asks, “Do you want fries with that?”

We have responded to such mockery by proclaiming the value of the liberal arts in the abstract: they create a well-rounded person, are good for democracy, and develop the life of the mind. All these are certainly true, but somehow each misses the point that the joke drives home. Today’s college students and their families want to see a tangible financial outcome from the large investment that is now American higher education. That doesn’t make them anti-intellectual; it makes them realists. Outside of home ownership, a college degree might be the largest single purchase for many Americans.

There is a disconnect: parents and students worry about economic outcomes while too many of us talk about lofty ideals. More families are questioning both the sticker price of schools and the value of whole fields of study. It is natural in this environment for us to feel defensive. It is time, however, that we in the liberal arts understand this new environment, and rather than merely react to it, proactively engage it. To many Americans the liberal arts are a luxury they feel they need to give up to make a living -- nice but impractical. We need to speak more concretely to the economic as well as the intellectual value of a liberal arts degree.

The liberal arts always situate graduates on the road to success. More Fortune 500 CEOs have had liberal arts B.A.s than professional degrees. The same is true of doctors and lawyers. And we know the road to research science most often runs through a liberal arts experience. Now more than ever, as employment patterns seem to be changing, we need to engage the public on the value of a liberal arts degree in a more forceful and deliberate way.

We are witnessing an economic shift that may be every bit as profound as the shift from farm to factory. Today estimates are that over 25 percent of the American population is working as contingent labor -- freelancers, day laborers, consultants, micropreneurs.

Sitting where we do, it is easy to dismiss this number because we assume it comes from day laborers and the working class, i.e., the non-college-educated. But just look at higher education's use of adjuncts and you see the trend. The fastest-growing sector of this shift is in the formerly white-collar world our students aspire to. This number has been rising steadily and is projected to continue its upward climb. We are living in a world where 9-to-5 jobs are declining, careers with one company over a lifetime are uncommon, and economic risk has shifted from large institutions to individuals. Our students will know a world that is much more unstable and fluid than the one of a mere generation ago.

We have known for many years that younger workers (i.e., recent college graduates) move from firm to firm, job to job, and even career to career during their lifetimes. What we are seeing now, however, is different: many Americans are hustling from gig to gig, too. These workers, many of them our former students, may never know economic security, but they may know success. For many new-economy workers, success is measured by more than just money; freedom, flexibility, and creativity count too.

If this is the new economy our students are going to inherit, we as college and university administrators, faculty and staff need to take stock of the programs we offer (curricular as well as extracurricular) to ensure that we serve our students' needs and set them on a successful course for the future. The skills they will need may be different from those of their predecessors. Colleges and universities with a true culture of assessment already are making the necessary strategic adjustments.

In 1956, William Whyte, the noted sociologist, wrote The Organization Man to name the developing shift in work for that generation. Whyte recognized that white-collar workers traded independence for stability and security. What got them ahead in the then-new economy was the ability to fit in (socialization) and a deep set of narrow vocational skills. Firms at the time developed career ladders, and successful junior executives who honed their skills and got along advanced up the food chain.

Today, no such career ladder exists. And narrow sets of skills may not be the ticket they once were. We are witnessing a new way of working developing before our eyes. Today, breadth, cultural knowledge and sensitivity, flexibility, the ability to continually learn, grow and reinvent, technical skills, as well as drive and passion, define the road to success. And liberal arts institutions should take note, because this is exactly what we do best.

For liberal arts educators, this economic shift creates a useful moment to step out of the shadows. We no longer need to be defensive because what we have to offer is now more visibly useful in the world. Many of the skills needed to survive and thrive in the new economy are exactly those a well-rounded liberal arts education has always provided: depth, breadth, knowledge in context and motion, and the search for deeper understanding.

It will not be easy to explain to future students and their parents that a liberal arts degree may not lead to a particular “job” per se, because jobs in the traditional sense are disappearing. But, we can make a better case about how a liberal arts education leads to both a meaningful life and a successful career.

In this fluid world, arts and sciences graduates may have an advantage. They can seek out new opportunities and strike quickly. They are innovative and nimble. They think across platforms, understand society and culture, and see technology as a tool rather than an end in itself. In short, liberal arts graduates have the tools to make the best of the new economy. And, above all, we need to do a better job of identifying our successes, our alumni, and presenting them to the public. We need to ensure that the public knows a liberal arts degree is still, and always has been, a ticket to success.

This could be a moment for the rebirth of the liberal arts. For starters, we are witnessing exciting new research about the economy that is situating the discussion more squarely within the liberal arts orbit, and in the process blurring disciplinary boundaries. These scholars are doing what the American studies scholar Andrew Ross has called “scholarly reporting,” a blend of investigative reporting, social science, and ethnography, as a way to understand the new-economy shift. Scholars such as the sociologists Dalton Conley and Sharon Zukin and the historian Bryant Simon offer new models of engaged scholarship that explain the cultural parameters of the new economy. We need to recognize and support this research because increasingly we will need to teach it as the best way to ensure our students understand the moment.

We also need to be less territorial, and recognize that the professional schools are not the enemy. They have a lot to offer our students. Strategic partnerships between professional schools and the arts and sciences enrich both and offer liberal arts students important professional opportunities long closed off to them. We also need to find ways to be good neighbors to the growing micropreneurial class, by providing space, Wi-Fi, or interns. Some schools have created successful incubators, which can jump-start small businesses and give their students important ground-floor exposure to the emerging economy.

Today’s liberal arts graduates will need to function in an economy that is in some ways smaller. Most will work for small firms and many will simply work on their own. They will need to multitask as well as blend work and family. And, since there will be little budget or time for entry-level training, we need to ensure that all our students understand the basics of business even if they are in the arts. We also might consider preparing our graduates as if they were all going to become small business owners, because in a sense many of them are going to be micropreneurs.

Richard A. Greenwald is dean of the Caspersen School of Graduate Studies, director of university partnerships, and professor of history at Drew University in Madison, N.J. His next book is entitled The Micropreneurial Age: The Permanent Freelancer and the New American (Work)Life.

Liberal Arts I: They Keep Chugging Along

When the economy goes down, one expects the liberal arts -- especially the humanities -- to wither, and laments about their death to go up. That’s no surprise since these fields have often defined themselves as unsullied by practical application. This notion provides little comfort to students -- and parents -- who are anxious about their post-college prospects; getting a good job -- in dire times, any job -- is of utmost importance. (According to CIRP’s 2009 Freshman Survey, 56.5 percent of students -- the highest since 1983 -- said that “graduates getting good jobs” was an important factor when choosing where to go to college.)

One expects students, then, to rush to courses and majors that promise plenty of entry-level jobs. Anticipating this, college administrators would cut back or eliminate programs that are not “employment friendly,” as well as those that generate little research revenue. Exit fields like classics, comparative literature, foreign languages and literatures, philosophy, religion, and enter only those that are preprofessional in orientation. Colleges preserving a commitment to the liberal arts would see a decline in enrollment; in some cases, the institution itself would disappear.

So runs the widespread narrative of decline and fall. Everyone has an anecdote or two to support this story, but does it hold in general and can we learn something from a closer examination of the facts?

The National Center for Education Statistics reports that the number of bachelor's degrees in “employment friendly” fields has been on the rise since 1970. Undergraduate business degrees -- the go-to “employment friendly” major -- have increased from 115,400 conferred in 1970-71 to 335,250 in 2007-08. In a parallel development, institutions graduated seven times more communications and journalism majors in 2007-08 than in 1970-71. And while the numbers are small, there has been exponential growth in “parks, recreation, leisure, and fitness studies,” “security and protective services,” and “transportation and materials moving” degrees. Computer science, on the other hand, peaked in the mid-80s, dropped in the mid-90s, peaked again in the mid-2000s, and has dropped again in the last five years.

What has students’ turn to such degrees meant for the humanities and social sciences? A mapping of bachelor’s degrees conferred in the humanities from 1966 to 2007 by the Humanities Indicator Project shows that the percentage of such majors was highest in the late 1960s (17-18 percent of all degrees conferred), low in the mid-1980s (6-7 percent), and more or less level since the early 1990s (8-9 percent). Trends, of course, vary from discipline to discipline.

Degrees awarded in English dropped from a high of 64,627 in 1970-71 to half that number in the early 1980s, before rising to 55,000 in the early 1990s and staying at that level since then. The social sciences and history were hit with a similar decline in majors in the 1970s and 1980s, but recovered nicely in the years since and now have more majors than they did in 1970. The numbers of foreign language, philosophy, religious studies, and area studies majors have been stable since 1970. IPEDS data pick up where the Humanities Indicator Project leaves off and show that in 2008 and 2009, the number of students who graduated with bachelor's degrees in English, foreign languages and literatures, history, and philosophy and religion remained at the same level.

What’s surprising about this bird’s-eye view of undergraduate education is not the increase in the number of majors in programs that should lead directly to a job after graduation, but the fact that the number of degrees earned in the humanities and related fields has not been adversely affected by the financial troubles that have come and gone over the last two decades.

Of course, macro-level statistics reveal only part of the story. What do things look like at the ground level? How are departments faring? Course enrollments? Majors? Since the study of the Greek and Roman classics tends to be a bellwether for trends in the humanities and related fields (with departments that are small and often vulnerable), it seemed reasonable to ask Adam Blistein of the American Philological Association whether classics departments were being dropped at a significant number of places. “Not really” was his answer; while the classics major at Michigan State was cut, and a few other departments were in difficulty, there was no widespread damage to the field -- at least not yet.

Big declines in classics enrollments? Again, the answer seems to be, “Not really.” Many institutions report a steady gain in the number of majors over the past decade. Princeton’s classics department, for example, announced 17 graduating seniors this past spring, roughly twice the number of three decades ago. And the strength is not just in elite institutions. Charles Pazdernik at Grand Valley State University in hard-hit Michigan reported that his department has 50+ majors on the books and strong enrollments in language courses.

If classics seems to be faring surprisingly well, what about the modern languages? There are dire reports about German and Russian, and the Romance languages seem increasingly to be programs in Spanish, with a little French and Italian tossed in. The Modern Language Association reported in fall 2006 -- well before the current downturn -- a 12.9 percent gain in language study since 2002. This translates into 180,557 more enrollments. Every language except Biblical Hebrew showed increases, some exponential -- Arabic (126.5 percent), Chinese (51 percent), and Korean (37.1 percent) -- while others less so -- French (2.2 percent), German (3.5 percent), and Russian (3.9 percent). (Back to the ancient world for a moment: Latin saw a 7.9 percent increase, and ancient Greek 12.1 percent). The study of foreign languages, in other words, seems not to be disappearing; the mix is simply changing.

Theoretical and ideological issues have troubled and fragmented literature departments in recent years, but a spring 2010 conference on literary studies at the National Humanities Center suggests that the field is enjoying a revitalization. The mood was upbeat, the discussion eloquent and innovative; no doom and gloom, even though many participants were from institutions where painful budget cuts had recently been made.

A similar mood was evident at the National Forum on the Future of Liberal Education, a gathering of some highly regarded assistant professors in the humanities and social sciences this past February. They were well aware that times were tough, the job market for Ph.D.s miserable, and tenure prospects uncertain. Yet their response was to get on with the work of strengthening liberal education, rather than bemoan its decline and fall. Energy was high, and with it the conviction that the best way to move liberal education forward was to achieve demonstrable improvements in student learning.

It’s true that these young faculty members are from top-flight universities. What about smaller, less well-endowed institutions? Richard Ekman of the Council of Independent Colleges reports that while a few of the colleges in his consortium are indeed in trouble, most are doing quite well, increasing enrollments and becoming more selective.

And what about state universities and land grant institutions, where most students go to college? Were they scuttling the liberal arts and sciences because of fierce cutbacks? David Shulenburger of the Association of Public and Land-grant Universities says that while budget cuts have resulted in strategic “consolidation of programs and sometimes the elimination of low-enrollment majors,” he does not “know of any public universities weakening their liberal education requirements.”

Mark Twain once remarked that reports of his death were greatly exaggerated. The liberal arts disciplines, it seems, can say the same thing. The on-the-ground stories back up the statistics and reinforce the idea that the liberal arts are not dying, despite the soft job market and the recent recession. Majors are steady, enrollments are up in particular fields, and students -- and institutions -- aren’t turning their backs on disciplines that don’t have obvious utility for the workplace. The liberal arts seem to have a particular endurance and resilience, even when we expect them to decline and fall.

One could imagine any number of reasons why this is the case -- the inherent conservatism of colleges and universities is one -- but maybe something much more dynamic is at work. Perhaps the stamina of the liberal arts in today’s environment draws in part from the vital role they play in providing students with a robust liberal education, that is, a kind of education that develops their knowledge in a range of disciplinary fields and, importantly, their cognitive skills and personal competencies. The liberal arts continue -- and likely always will -- to give students an education that delves into the intricate language of Shakespeare or Woolf, or the complex historical details of the Peloponnesian War or the French Revolution. That is a given.

But what the liberal arts also provide is a rich site for students to think critically, to write analytically and expressively, to consider questions of moral and ethical importance (as well as those of meaning and value), and to construct a framework for understanding the infinite complexities and uncertainties of human life. This is, as many have argued before, a powerful form of education, a point with which, as the statistics and anecdotes show, students agree.

W. Robert Connor is the former president of the Teagle Foundation, to which he is now a senior adviser. Cheryl Ching is a program officer at Teagle.

Putting the 'Humanities' in 'Digital Humanities'

Reflecting on the recent Humanities and Technology conference (THATCamp) in San Francisco, what strikes me most is that digital humanities events consistently tip more toward the logic-structured digital side of things. That is, they are less balanced out by the humanities side. But what I mean by that has itself been a problem I've been mulling for some time now. What is the missing contribution from the humanities?

I think this digital dominance revolves around two problems.

The first is an old problem. The humanities’ pattern of professional anxiety goes back to the 1800s and stems from pressure to incorporate the methods of science into our disciplines or to develop our own, uniquely humanistic, methods of scholarship. The "digital humanities" rubs salt in these still open wounds by demonstrating what cool things can be done with literature, history, poetry, or philosophy if only we render humanities scholarship compliant with cold, computational logic. Discussions concern how to structure the humanities as data.
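
What “structuring the humanities as data” means in practice can be shown in a few lines. The sketch below flattens a famous Blake couplet into the one-row-per-word form a database expects; the schema is invented for illustration (real projects use standards such as TEI):

```python
# A couplet from William Blake's "Auguries of Innocence," recast as data.
poem = {
    "author": "William Blake",
    "title": "Auguries of Innocence",
    "lines": [
        "To see a World in a Grain of Sand",
        "And a Heaven in a Wild Flower",
    ],
}

# One row per word: the boring spreadsheet beneath the pretty graphs.
rows = [
    {"title": poem["title"], "line": i + 1, "position": j + 1, "word": w}
    for i, line in enumerate(poem["lines"])
    for j, w in enumerate(line.split())
]

print(rows[0])  # {'title': 'Auguries of Innocence', 'line': 1, 'position': 1, 'word': 'To'}
```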

The showy and often very visual products built on such data, and the ease with which the information contained within them is intuitively understood, appear at first blush to be a triumph of quantitative thinking. The pretty, animated graphs and fluid screen forms belie the fact that boring spreadsheets and databases contain the details. Humanities scholars, for their part, often recoil from the presumably shallow grasp of a subject that data visualization invites.

For many of us trained in the humanities, to contribute data to such a project feels a bit like chopping up a Picasso into a million pieces and feeding those pieces one by one into a machine that promises to put it all back together, cleaner and prettier than it looked before.

Which leads to the second problem, the difficulty of quantifying an aesthetic experience and — more often — the resistance to doing so. A unique feature of humanities scholarship is that its objects of study evoke an aesthetic response from the reader (or viewer). While a sunset might be beautiful, recognizing its beauty is not critical to studying it scientifically. Failing to appreciate the economy of language in a poem about a sunset, however, is to miss the point.

Literature is more than the sum of its words on a page, just as an artwork is more than the sum of the molecules it comprises. To itemize every word or molecule on a spreadsheet is simply to apply more anesthetizing structure than humanists can bear. And so it seems that the digital humanities is a paradox, trying to combine two incompatible sets of values.

Yet, humanities scholarship is already based on structure: language. "Code," the underlying set of languages that empowers all things digital, is just another language entering the profession. Since the application of digital tools to traditional humanities scholarship can yield fruitful results, perhaps what is often missing from the humanities is a clearer embrace of code.

In fact, "code" is a good example of how something that is more than the sum of its parts emerges from the atomic bits of text that logic demands must be lined up next to each other in just such-and-such a way. When well-structured code is combined with the right software (e.g., a browser, which itself is a product of code), we see William Blake’s illuminated prints, or hear Gertrude Stein reading a poem, or access a world-wide conversation on just what is the digital humanities. As the folks at WordPress say, code is poetry.

I remember 7th-grade homework assignments programming onscreen fireworks explosions in BASIC. Back then, I was willing to patiently decipher code only because of the promise of cool graphics on the other end. When I was older, I realized that I was willing to read patiently through Hegel and Kant because I had learned to see the fireworks in the code itself. To avid readers of literature, the characters of a story come alive to us, laying bare our own feelings or moral inclinations in the process.

Detecting patterns, interpreting symbolism, and analyzing logical inconsistencies in text are all techniques used in humanities scholarship. Perhaps the digital humanities' greatest gift to the humanities can be the ability to invest a generation of "users" in the techniques and the practiced, meticulous attention to detail required to become a scholar.
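
As a closing illustration, here is the first of those techniques, pattern detection, at its most elementary: counting repetitions in two lines of Shakespeare. A sketch only; the machine finds the pattern, but deciding what it means is still the scholar's work.

```python
from collections import Counter
import re

VERSE = """Shall I compare thee to a summer's day?
Thou art more lovely and more temperate:"""

# Reduce the verse to word counts. The repetition of 'more' surfaces
# mechanically; interpreting that emphasis remains a human judgment.
words = re.findall(r"[a-z']+", VERSE.lower())
print(Counter(words).most_common(2))  # [('more', 2), ('shall', 1)]
```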

Trained in analytic philosophy, Phillip Barron is a digital history developer at the University of California at Davis.

Historians Honor Picket Line

Honoring a labor boycott, the Organization of American Historians is moving its annual meeting from San Francisco to San Jose.

'From Concentration Camp to Campus'

The internment of Japanese Americans in World War II remains a shameful episode in American history. In From Concentration Camp to Campus: Japanese American Students and World War II (University of Illinois Press), Allan W. Austin focuses on a positive event during the internments. More than 4,000 college students were allowed to leave the camps to enroll in colleges -- provided that the colleges would accept them and were not on the West Coast.

3 Historians Win Bancrofts

Columbia prizes honor works on antebellum Virginia, race and the Supreme Court, and Southern intellectualism.

Bad Scholarship, Bad Politics or Bad Luck?

2 papers dispute the conventional wisdom about a recent series of scandals involving historians.
