We hear these days of the "crisis of the humanities." Majors, jobs, and student interest in these subjects are all in decline. The Boston Globe offered one report on the worries of the humanities in an article last year about the new Mandel Center at Brandeis University. The Globe asserted, "At college campuses around the world, the humanities are hurting. Students are flocking to majors more closely linked to their career ambitions. Grant money and philanthropy are flowing to the sciences. And university presidents are worried about the future of subjects once at the heart of a liberal arts education."
Such gloom must be placed in context. Doubts about the humanities have been around at least since Aristophanes wrote The Clouds. The playwright claimed that if a man engaged in the "new" Socratic form of teaching and questioning, he could wind up with big genitals (apparently seen as a negative side effect) due to a loss of self-control. But the Socratic humanities survived, in spite of the execution of their founder, through the schools of his intellectual son and grandson -- the Academy of Plato and the Lyceum of Aristotle.
I don't think that the humanities are really in a crisis, though perhaps they have a chronic illness. Bachelor's degrees in the humanities have held relatively steady since 1994 at roughly 12-13 percent of all majors. Such figures demonstrate that the health of the humanities is not robust, as measured in terms of student preferences. In contrast, the number of undergraduate business majors is steadily increasing.
So what has been the response of university and college leaders to the ill health of the humanities?
It has been to declare to applicants, students, faculty, and the public that these subjects are important. It has included more investments in humanities, from new buildings like the Mandel Center at Brandeis, to, in some cases, hiring more faculty and publicizing the humanities energetically. Dartmouth College's president, Jim Yong Kim, recently offered the hortatory remark that "Literature and the arts should not only be for kids who go to cotillion balls to make polite conversation at parties."
I couldn't agree more with the idea that the humanities are important. But this type of approach is what I call the "eat it, it's good for you" response to the curricular doldrums of the humanities. That never worked with my children when it came to eating broccoli, and it is even less likely to help increase humanities enrollments nationally today.
The dual-horned dilemma of higher education is the erosion of the number of majors in the humanities on the one hand and the long-feared "closing of the American mind" on the other, produced in part by the growing number of students taking what some regard as easy business majors. Yet these problems can only be solved by harnessing the power of culture, by understanding the ethno-axiological soup from which curriculums evolve and find their sustenance. Jerome Bruner has long urged educators to connect with culture, to recognize that the environment in which we operate is a value-laden behemoth whose course changes usually consume decades, a creature that won't be ignored.
It is also vital that we of the humanities not overplay our hands and claim for ourselves a uniqueness that we do not have. For example, it has become nearly a truism to say that the humanities teach "critical thinking skills." This is often correct of humanities instruction (though certainly not universally so). But critical thinking is unique neither to the humanities nor to the arts and sciences more generally. A good business education, for example, teaches critical thinking in management, marketing, accounting, finance, and other courses. More realistically and humbly, what we can say is that the humanities and sciences provide complementary contexts for reasoning and cultural knowledge that are crucial to functioning at a high level in the enveloping society.
Thus, admitting that critical thinking can also be developed in professional schools, we realize that it is enhanced and further developed when the thinker learns to develop analytical skills in history, different languages, philosophy, mathematics, and other contexts. The humanities offer a distinct set of problems that hone thinking skills, even if they are not the only critical thinking game in town. At my institution, Bentley University, and other institutions where most students major in professional fields, for example, English develops vocabulary and clarity of expression while, say, marketing builds on and contributes to these. Science requires empirical verification and consideration of alternatives. Accountancy builds on and contributes to these. Science and English make better business students as business courses improve thinking in the humanities and sciences.
If, like me, you believe that the humanities do have problems to solve, I hope you agree that they are not going to be solved by lamenting the change in culture and exhorting folks to get back on course. That's like holding your finger up to stop a tidal wave. Thinking like this could mean that new buildings dedicated to the humanities will wind up as mausoleums for the mighty dead rather than as centers of engagement with modern culture and the building of futures in contemporary society.
So what is there to do? How do we harness the power of culture to revive and heal the influence of the humanities on future generations? Remember, Popeye didn't eat his spinach only because it was good for him. He ate his spinach because he believed that it was a vital part of his ability to defend himself from the dangers and vicissitudes of life, personified in Bluto. And because he believed that it would give him a good life, represented by Olive Oyl.
Recently, an alumnus of Bentley told me over dinner, "You need business skills to get a job at our firm. But you need the arts and sciences to advance." Now, that is the kind of skyhook that the friends of the humanities need in order to strengthen their numbers, perception, and impact.
While I was considering the offer to come to Bentley as its next dean of arts and sciences, Brown University and another institution were considering me for professorial positions. Although I felt honored, I did not want to polish my own lamp when I felt that much in the humanities and elsewhere in higher education risks becoming a Ponzi scheme, which Wikipedia accurately defines as an "...operation that pays returns to separate investors, not from any actual profit earned by the organization, but from their own money or money paid by subsequent investors."
I wanted to make my small contribution to solving this problem, so I withdrew from consideration for these appointments to become an administrator and face the issue on the front line. And Bentley sounded like exactly the place to be, based on pioneering efforts to integrate the humanities and sciences into professional education -- such as our innovative liberal studies major, in which business majors complete a series of courses, reflections, and a capstone project emerging from their individual integration of humanities, sciences, and business.
Programs that take in students without proper concern for their future or provision for post-graduate opportunities -- how they can use what they have learned in meaningful work -- need to think about the ethics of their situation. Students no longer come mainly from the leisured classes that were prominent at the founding of higher education. Today they need to find gainful employment in which to apply all the substantive things they learn in college. Majors that give no thought to that small detail seem to assume that since the humanities are good for you, the financial commitment and apprenticeship between student and teacher is fully justified. But in these cases, the numbers of students benefit the faculty and particular programs arguably more than they benefit the students themselves. This is a Ponzi scheme. Q.E.D.
The cultural zeitgeist requires of education that it be intellectually well-balanced and focused but also useful. Providing all of these and more is not the commercialization of higher education. Rather, the combination of professional education and the humanities and sciences is an opportunity to at once (re-)engage students in the humanities and to realize Dewey's pragmatic goal of transforming education by coupling concrete objectives with abstract ideas, general knowledge, and theory.
I have labeled this call for a closer connection between the humanities and professional education the "Crucial Educational Fusion." Others have recognized this need, as examples in the new Carnegie Foundation for the Advancement of Teaching book Rethinking Undergraduate Business Education: Liberal Learning for the Profession illustrate. This crucial educational fusion is one solution to the lethargy of the humanities -- breaking down academic silos, building the humanities into professional curriculums, and creating a need for the humanities. Enhancing their flavor like cheese on broccoli.
Daniel L. Everett
Daniel L. Everett is dean of arts and sciences at Bentley University.
How universities are organized can confuse not only the sympathetic, casual observer of higher education but students and staff members as well.
One campus has a college of arts and sciences; another has separate colleges of sciences, humanities, and social sciences. Microbiology can be in the college of natural resources and environment at one place, and in the school of sciences or the medical school somewhere else. Modern foreign languages may be organized in departments that encompass all of the modern foreign languages and their literatures, in departments devoted to Spanish and Portuguese or to French and Italian, or in other combinations.
Insiders know, however, that all of these organizational permutations reflect not only significant changes in the universe of knowledge but also internal structures of personality, politics, money and power as well as the external pressures of fad, fashion or funding. Academic reorganization is a frequent exercise on university campuses, and often generates tremendous controversy because each effort signifies a potential for gain or loss in academic positioning for money, power and prestige.
Although, to outsiders, the warfare that these reorganizations frequently provoke can often appear out of proportion to the stakes involved, insiders know that organizational structure can influence internal distributions of resources. Even more importantly for many faculty and students, the organizational structure serves as a prestige map.
Reorganizations that adjust the boundaries of campus subunits are among the most complicated of issues, because in some cases reorganization is a good and effective thing, while in other cases that look almost the same it is a scam. Reorganization as an internal political exercise occurs frequently, but so too do readjustments to reflect the expansion and redefinition of knowledge. Separating the substantive from the political requires some careful observation.
For example, the development of a subdiscipline into a major field of study is a complex and fascinating process that produces new departments such as computer science or biomedical engineering. The emergence of new departments or academic guilds follows the development of specific intellectual domains with their own methodology, journals, research agenda, and definition of the particular intellectual skills required to advance knowledge in that area.
The academic guilds eventually determine what new fields have reached sufficient maturity of methodology and intellectual focus to warrant separate status as departments, with the attendant definition of a specific set of requirements for the Ph.D. and often a particular pattern of courses for an undergraduate major. Often national funding agencies and research foundations help advance these changes by supporting research based in defined departments that can give the new research direction and continuity.
Although these intellectual advances often produce some controversy about the point at which a subfield deserves recognition as a major discipline with its own department, much of the controversy turns on legitimate intellectual issues of methodology and academic substance. These represent significant efforts to readjust the academic world to match advances in knowledge and the organization of scholarship.
Other reorganizations represent mostly varieties of academic game playing. They reflect much less academic substance and instead turn on issues of politics, power, prestige and money.
The game often takes place in shadow form, with highly evolved intellectual arguments that underneath speak to the issues of prestige and money. If one department consolidates with another, the loss in academic status for the members of the consolidated unit can be devastating. Similarly, if a field gains separate bureaucratic status as an independent department, a substantial status gain results. It is much better to be a department of Spanish than a field within a department of Romance languages. It is much better to be a school of journalism than a department within a college of arts and sciences. The goal of these organizational transformations is for subgroups of like-minded faculty to have a seat at the institutional table for the distribution of resources, rather than to suffer the risk of having someone less sympathetic to their particular subdiscipline speak for them.
Other organizational anomalies reflect historical, accidental, or opportunistic events. Some institutions, concerned that the traditional arts and sciences reflected a domain too large for effective administration, divided the disciplines into subgroups: humanities, social sciences, and sciences, or some variation. In such cases, departments like history reside within either the humanities or the social sciences, depending on the intellectual fashion of historians at the time of reorganization.
Business schools can acquire business-like units, and a management school at one institution may include such programs as sports and hospitality management while in another these programs reside in colleges of human performance or continuing education or in separate freestanding schools of hospitality management. Music departments live within colleges of humanities and fine arts or exist as separate schools of their own depending on their size, their focus on performance as opposed to theory or history, and the accidents of their original founding.
Many campus leaders take on reorganization projects to try to align the bureaucratic structure of units with a clear sense of the institution’s academic mission. These efforts can provide a major focus of engagement for the campus, occupy faculty task forces and councils in heady debate, and then, after an extended period, produce a new organizational matrix.
The value of such reorganization varies. Sometimes reorganization can reduce the fragmentation of the campus produced by prior political warfare, consolidate micro-administrative units, and achieve some economies of scale in staff and management. In other cases, the reorganization simply serves to distract the campus from the need to work harder, better, and more competitively. Reorganization takes much time and energy and often substitutes for the real work of requiring performance from the units. Reorganization is also a highly visible form of executive leadership that places senior administrators in publicity-rewarding, take-charge roles.
The beauty of a reorganization initiative in this context is that it has no measurable outcome. No one has an obligation to demonstrate that the new organization is more effective than the old one, and even if it is more effective, the results will not appear for several years. Reorganization achieves the appearance of significant administrative leadership without an obligation to deliver any improvement in the quality or productivity of teaching or research. It also refocuses everyone inward on the internal competition for position, place, and money, diverting attention from the necessity of competing in the outside marketplaces of higher education.
Other reorganizations, however, follow the money. In cases where a particular subunit of a campus becomes remarkably successful at attracting external funding, a frequent result is a reorganization that gives the highly successful unit separate bureaucratic identity. Sometimes this occurs through the invention of institutes and centers, which are holding places for academic entrepreneurial success. In other cases, subunits of traditional departments or programs become independent departments, such as polymer sciences or legal studies. A music department can acquire external resources, hire nationally preeminent faculty, and emerge as a freestanding music school. A journalism department can expand its scale through grants, external programs, and fund raising and break free from a college of arts and sciences to become its own school.
For those conversant in the internal political dynamics of universities, the organizational chart of departments, schools, and colleges, and the list of centers and institutes, serve as a guide to the political history of the campus’s intellectual enterprise. By reviewing this chart, a newcomer acquires a sense of the relative political power and intellectual and financial muscle of the various campus units.
University systems also have their own particular and peculiar organizational structures that they revise and reorder frequently, also in response to political and fiscal pressures of various kinds, but that is a topic for another day.
The dreaded question: “So, what are you teaching this semester?” When I reply that I teach a business ethics course, more often than not my questioner laughs and asks whether that isn’t an oxymoron. And then laughs some more.
So it is hardly surprising that the recent cheating scandal at Duke University's business school has fueled cynicism about the teaching of business ethics. Business schools across the country responded to corporate wrongdoing over the last decade by emphasizing ethics within their curriculums. In Duke's daytime M.B.A. program, students are required to take "Leadership, Ethics and Organizations" as part of an initial three-week summer term. Yet close to 10 percent of first-year students in Duke's M.B.A. program were suspected of cheating on a take-home examination. The collective laughter would have been greater only if the accused students were in one of Duke's ethics courses.
Still, we should be careful not to infer too much from the Duke cheating scandal. A successful ethics component within a business program does not guarantee that its participants will never behave immorally. Not even churches or prisons boast that kind of effectiveness. So why should we expect it of an ethics class? What we expect is that when students complete the ethics component, they will approach moral problems with greater thoughtfulness and intellectual sophistication, as well as be more likely to resolve these problems in the right way. The goal is improvement, not perfection.
The behavior of the Duke M.B.A. students nevertheless gives us reason to pause. How much thoughtfulness and intellectual sophistication are necessary to know that cheating is wrong? Surely these young professionals did not need an ethics class to garner this important piece of moral knowledge. But if the students were aware of the wrongness of cheating all along, what kind of knowledge were they missing? What, exactly, could they have been taught in business ethics?
There is something more for business students to learn in ethics classes, and throughout their business programs. Ethics is not just about the what of morality; it is also about the whom of morality.
In ethics, the general requirements -- the what of morality -- are often quite straightforward. Indeed we would be hard pressed to find anyone in our society, let alone a university-level student, who was unaware of the general prohibition on cheating. However, the application of these requirements to individuals -- the whom of morality -- can be significantly murkier. I dare say it would not be difficult at all to find students who genuinely believe that their circumstances justify them in violating the prohibition on cheating.
Doing the right thing in the Duke case therefore required two things. First, the M.B.A. students needed to know that cheating is generally morally wrong. Second, they needed to know that it was wrong for them to cheat in their particular circumstances.
Why do people sometimes believe that moral requirements do not apply to them in the situations they face? The most compelling answer appeals to consequences. People predict that breaking the rules will have high payoffs. And where are the opposing costs? After all, rule-breaking really doesn’t seem to hurt anyone else, especially in environments in which others similarly break the rules. Of course the rules of morality generally ought to be followed. But only as long as the costs aren’t too high.
The consequentialist logic of business education may encourage this kind of thinking. There is no mistaking the fact that profit maximization is the chief value within many business curriculums. As a result, brief surveys of business law, discussions of company codes of conduct, or even introductions to ethical theory -- the what of morality -- are very likely to buckle under the comparative weight given to considerations of profit, goal achievement, cost-benefit analysis, and shareholder satisfaction.
Does this mean that business ethics really is an oxymoron? Not if business schools are willing to take the whom of morality seriously and educate students throughout the curriculum about the application of ethical requirements to all business actors. Among other things, this kind of education would draw on traditional academic disciplines such as philosophy, psychology, and politics to help students understand their place in the world and the role of business in society.
Ultimately, business ethics requires that we rethink the business curriculum. Business is not a closed system with its own set of values, motivations, and rules. The curriculum should reflect this fact. First, students must be able to think deeply and critically about conflicts between wealth and other values. Second, students should know more about ordinary human psychology, especially the tendency to overestimate our own importance and the importance of our goals. Third, students need a greater awareness of the interdependence of business and the rest of civil society. Unfortunately, students cannot get this kind of education from a curriculum that focuses only on the business “fundamentals.”
So it is not enough for business students to hear yet again that certain behaviors are generally prohibited by morality. They must also come to see that these prohibitions apply to them even when morality conflicts with self-interest, the bottom line, and the interests of investors.
When business schools start taking ethics seriously, maybe people will stop laughing.
Terry L. Price
Terry L. Price is visiting associate professor of philosophy and a fellow at the Parr Center for Ethics at the University of North Carolina at Chapel Hill. He is associate professor at the University of Richmond’s Jepson School of Leadership Studies and author of Understanding Ethical Failures of Leadership (Cambridge University Press).
I’ve spent most of my career other than where I should be. I’m a business professor teaching at a liberal arts college. When I walk outside my office door, I’m more likely to bump into a colleague discussing Buddhism or chaos theory than one who’s talking about the latest Academy of Management conference.
But having an unconventional career can engender an uncommon freedom -- the chance to think about things most others regard as settled. Bringing business education and the liberal arts into close proximity, as happens at many small liberal arts colleges today, can unsettle the assumptions of each. But if done well -- and it takes a serious commitment to do it well -- there are tremendous benefits, certainly to business education and, surprisingly to many, also to the liberal arts.
The key to a business program flourishing at a liberal arts college is threefold: blending, bridging, and building. Blending entails teaching traditional business subjects from a liberal arts perspective. Bridging involves connecting the content of business classes to other disciplines. Building entails using the innovative study of business to develop and enrich the broader liberal arts curriculum itself. This occurs when the distinctive nature of business programs prompts larger questions about the nature of liberal education. (I have adopted the terms “blending” and “bridging” here from my fellow scholars, E. Byron Chew and Cecilia McInnis-Bowers.)
Teaching traditional business subjects from a liberal arts perspective involves recognizing the hidden biases that can inhere in the instruction of core business functions, from marketing to accounting to management, when such subjects are taught in a purely technical manner. Such biases can take many forms. They can involve an uncritical acceptance of a particular goal, such as occurs when finance theory stipulates the maximization of shareholder wealth as the sole end of companies, notwithstanding several real-world examples to the contrary.
Biases can also arise when business courses explore central domains of commerce from a single vantage point rather than from multiple perspectives. Conventional business majors, for instance, are much more likely to include courses in marketing than in consumer protection, even though the introduction of goods and services into society can hardly be fully understood if viewed only from the perspective of producers. Further, business courses can subtly convey biases by failing to contextualize adequately for students their basic inquiries. Traditional courses in organizational behavior, for example, draw heavily from psychology, yet often leave the assumptions of psychological theory unexplored. The danger here is that students uncritically adopt understandings of the human psyche while believing they are merely learning the practical organizational dynamics of business firms.
Teaching business subjects from a liberal arts perspective thus requires professors to be cognizant of such traditional biases and to teach in ways that expose them as part of a larger dialogue. This means approaching business topics from a critical vantage point, engaging multiple perspectives, and richly contextualizing basic inquiries. This is what we are attempting in the introductory course to the major at my home institution, Franklin & Marshall College. Entitled “Organizing in the 21st Century: Theories of Organization,” the course focuses on traditional topics of strategic management, but does so critically, exploring alternative theories of work and organization. It engages the perspectives of the multiple stakeholders in our commercial world, from employees to managers to consumers to members of the larger community. It highlights the way many disciplines -- psychology, sociology, and anthropology, to cite only a few -- provide frameworks that can illuminate our commercial lives.
There are a number of opportunities for connecting the content of business classes to classes offered in more traditional liberal arts disciplines. Such opportunities need not involve creating new courses. The necessary courses are frequently already established and successful. Some students are even making the connections on their own, as when a biology student with an interest in working for a pharmaceutical company seeks out some relevant business courses. But we owe it to students not to leave them without support in discovering and pursuing these rich connections.
As with the biology student, these connections may be ones that help a student develop his larger career aspirations. Even in the most traditional business programs, there is a need to make practical connections with courses in such areas as legal studies, environmental studies, and international studies. Such areas of study are intrinsically valuable to a straightforward business career, as business operates in an increasingly litigious society, becomes more environmentally conscious, and further internationalizes its operations.
But at least of equal value are the connective opportunities that can satisfy a student’s deeper intellectual curiosities that have arisen in the study of business -- an interest in psychology prompted by the study of management, an interest in economic theory stimulated in a finance class, an interest in ethics engendered by seeing the conflicts of interests accountants face. The goal here should be to highlight and promote for students structured learning paths that encourage these sorts of avenues of inquiry. This can occur, informally, through simply enriching our advising of business students or, more formally, through the creation of curricular structures such as innovative minors.
Using the innovative study of business to develop and enrich the broader liberal arts curriculum is potentially the most far-reaching contribution a business program can make to a liberal arts college. For it involves raising larger questions about the nature of liberal education.
First, teaching business innovatively unsettles the way we have often thought of liberal education as arising only from the study of certain prescribed disciplines. Discipline-based notions of liberal education are prevalent today, even though the history of liberal education readily reveals its changing disciplinary nature. The natural sciences, for example, were not always an accepted part of the canon. Teaching business innovatively thus brings to the fore a basic, recurring issue for liberal education: Is the core of liberal arts instruction based on what we teach or how we teach?
Second, teaching business innovatively highlights the way in which we conventionally think about the distinction between basic and applied knowledge. We often conceive of applied knowledge in purely vocational terms, reserving for liberal study the pursuit of basic knowledge unfettered by constraining purposes. The innovative study of business unsettles this conventional dichotomy. For at the core of business study is the interplay between basic and applied knowledge. Thus, teaching business innovatively prompts a provocative question for liberal education: Does the application of knowledge diminish or deepen liberal education?
Third, teaching business innovatively spotlights the role of cross-disciplinary inquiry in liberal education. Those outside of business often forget that business study involves the integration of a number of distinct academic disciplines. Accounting, marketing, finance, and management have their own distinctive sets of knowledge bases, models, and assumptions -- at least some of which are in tension with one another. One is likely to get a very different sense, for instance, of human motivation in a finance class than in a class on organizational behavior. With its successful integration of multiple academic disciplines, the study of business is a highly developed form of area studies. It thus poses in an especially cogent way the larger issue that area studies raise for liberal education: Does the core of liberal education reside within disciplines or among them?
Liberal arts colleges that welcome the innovative teaching of business thus stand a better chance of addressing successfully such larger questions of liberal education. Of course, welcoming the study of our commercial lives into the world of liberal education is hardly the prevailing norm. It is rather, as this business professor read as an English major many years ago, taking the road “less traveled by.” But in my unconventional career, I can see how it “has made all the difference.”
Jeffrey Nesteruk is chair of the Department of Business, Organizations, and Society at Franklin & Marshall College.
Every American decade has its archetypes. If you were heading off to business school in the 1980s, you might have wondered – or even worried – you’d end up like Alex Keaton from the hit TV series "Family Ties." Alex scoffed at the Peace Corps past of his parents, and believed he could amass all the wealth and status that he wanted without being too concerned about the political affairs of the world around him -- beyond, perhaps, advocating for lower tax rates on capital gains.
Today Alex would not survive, much less thrive, in a world marketplace where economic events in nearly every developing and industrialized nation can dramatically impact the fortunes of others. Growing affluence in China coupled with the rise of ethanol, for example, has increased the demand for meat, which drives up global grain prices. At the same time, instability in the Nigerian delta directly influences the price of oil in New York, and a small business in Germany could easily be denied a loan from a distressed local bank that has over-invested in mortgages in the United States. Meanwhile, as we’ve seen in just the past few weeks, the implosion of the U.S. financial system continues to send aftershocks to financial markets and economies across the globe.
Unfortunately, too few colleges or universities are preparing students to understand these global dynamics. According to the Center for International Initiatives at the American Council on Education, the percentage of colleges that require a course with an international or global focus as part of the general education curriculum fell from 41 percent in 2001 to 37 percent in 2006. And 27 percent of the nation’s colleges and universities have no students at all who study abroad. Even colleges and universities that do promote “semester abroad” programs mostly offer them as add-ons to the required course of study, providing students with only a taste of life in another nation and a small selection of elective courses.
A far better approach would be to make international study a core component of undergraduate education in the 21st century -- requiring students to spend a significant portion of their college years abroad (e.g., two or more semesters) and to do so while studying in multiple locations. Students would thereby be exposed to the interconnections across multiple countries and cultures, giving them the opportunity to gain insight into the complex economic and political factors shaping our world.
My certainty on the need for this approach has been influenced by 20 years of experience as a business school educator. As a professor at the University of Chicago in the 1990s, I first observed the prevalence of a “free market ideology” among our first- and second-year M.B.A. students – a viewpoint that over-simplifies market dynamics and their impact on the social and political landscape. That’s when I first began to think about new models for undergraduate education that incorporate a deeper understanding of global economic dynamics and the interconnection between the private and public sectors.
Now, as dean at the New York University Stern School of Business’s Undergraduate College, I’ve worked with our faculty to create a new bachelor’s degree in business and political economy, designed to foster deeper understanding of the intersections between international business, politics and economics. Our new curriculum not only integrates these perspectives, but requires students to spend three semesters of global study on three different continents, where they experience the course of business in both industrialized and emerging market nations.
During their sophomore year in London, for example, students will study the foundations of economics and politics in Europe’s financial center, under the guidance of faculty from both NYU and local institutions. In Shanghai during the junior year, they will experience life in a developing country where commerce is thriving yet challenged by centuries of strict political rule. From there, they will travel to developing markets in India to gain a first-hand understanding of how a nation strives for capitalistic momentum despite having a large population of undereducated and underemployed citizens -- and how these converging economic and political factors will shape India’s strength in the world marketplace.
Through the experience, the students will learn how markets, corporations, governments, religions and cultures converge in nations that are inextricably linked to the success of capitalism in the U.S. – an understanding that cannot be easily replicated without spending a significant amount of time living and learning in these nations.
While I recognize that NYU’s existing infrastructure and history of international education enhance our ability to create this type of experience, there are many other ways for colleges and universities to better open students’ eyes to the convergence between international markets, economies, cultures and governments. They can begin by weaving the subject matter into existing coursework, combining international economics and business courses with politics, sociology and religion courses.
They can also augment their current foreign exchange programs -- going beyond simply having students “visit” back and forth -- by investing in deeper, more elaborate partnerships. For example, colleges from different continents could invest in developing integrated curricula across two (or more) global partner institutions, so that when students study abroad at a partner campus they have a more seamless academic experience, one that is specifically designed to promote deeper understanding of global economic, social and political issues. These programs could be supplemented by distance learning opportunities and the use of digital technology to connect students across partner campuses for virtual and collaborative learning experiences when back at their home campus.
While these recommendations may sound daunting, I would argue that moving undergraduate education in this direction is a social imperative. Given the ever-increasing connectedness of our complex world, students need to understand how political tensions, conflicting attitudes about globalization and religion, and the ever-expanding reach of free markets will impact worldwide security and the future of the global marketplace. And the best way to make that happen is to send them packing -- inspired and determined to understand the wonders of the world around them.
Sally Blount-Lyon is dean of the NYU Stern School of Business Undergraduate College, and special advisor to the provost for global academic integration at New York University.
President-elect Barack Obama has proposed replacing the Bowl Championship Series games with an eight-team playoff to determine the national college football champion in Division I-A. If his administration really has the time and inclination to deal with crises other than the national economic picture and our health care system, we would encourage him to focus on something more important than football: how American institutions of higher education are faring in the international education bowl.
Our national education game plan is in fact linked to our economic future. One crucial factor is how our colleges and universities raise and spend their money, for academics and for sports.
As two members of the Coalition on Intercollegiate Athletics, a consortium of university faculty senates concerned about sports issues, we offer here our opinions drawn from our long up-close and personal perspectives on how big-time sports has affected the academic missions at our two universities: the University of Oregon and the University of Texas at Austin. U.T.'s athletics director is (in)famous for declaring its program the "Joneses" of NCAA athletics, with which all others must keep up. Longhorns Inc., as Texas Monthly called Texas athletics in its November issue, outspends all but a few competitors. It is one of the few college sports programs to make a profit. The University of Oregon has also moved into the top 25 in Division I-A football. We discuss the costs to both institutions below.
In September 2006, the U.S. Secretary of Education's Commission on the Future of Higher Education painted a jumbotron-scale picture of the failings and needs of our colleges and universities in math, science, engineering and even reading and writing. The final report, ironically entitled A Test of Leadership, highlighted a serious decline in U.S. student academic performance compared to other countries. Undeterred by these alarming data, university leaders have increased athletic spending, while academic programs suffer.
Simply put, in balancing the institutional priorities of athletics and academics, many university presidents are failing our country. But they are not alone. The University of Texas System Board of Regents, which oversees U.T., has authorized spending a quarter of a billion dollars on stadiums, practice fields and sports arena enhancements since 2003. Other Texas universities spent $750 million during this same time on sports facilities.
"So what?" you might say. "U.T. athletics is making money." We might say this too, if winning football games were the real business of our educational institutions and made a positive contribution to our country's future. The one competition that matters internationally is education. And big-time sports spending carries an educational cost.
It is noteworthy and undoubtedly a relief to our president-elect that college sports is one American big business not looking for a federal bailout. But this is not because NCAA programs across the country are making money. The most recent NCAA study reports that only 17 of the 1,200+ NCAA athletic programs earned a net profit in the economically healthy period between 2004 and 2006. In 2006, 99 Division I-A programs ran deficits, averaging $8.9 million each. Since most universities cannot run deficits, the money for big-time sports spending comes from institutional academic budgets.
Moreover, universities already get a big handout from the federal government. By making skybox rental fees and mandatory donations for ticket-purchasing privileges tax deductible, our government actually encourages universities to build stadiums and arenas laden with luxury skyboxes and other kinds of preferred seating. That's where the big "tax-deductible" money is.
Wealthy sports boosters like Phil Knight (the University of Oregon) and T. Boone Pickens (Oklahoma State University) can write off their gifts of $100 million or more to sports programs as donations to higher education.
Congressional committees have examined these loopholes recently but have made no moves toward changing them. However, in fall 2007, the daughter of the late U.S. Rep. J.J. (Jake) Pickle, who authored the original Pickle Amendment that created the loopholes, was appalled to learn of the extravagant spending of the University of Texas athletics department. She wrote the Austin American-Statesman that her father never intended that "our sports programs" would "eclipse the purpose of the University of Texas."
There are also other hidden educational costs to sports spending. U.T.'s sprawling sports facilities take up precious building space on campus, even as the Texas Higher Education Coordinating Board has told the university that it needs an additional 1.4 million square feet of classroom and laboratory facilities to give its 50,000 students a satisfactory education. At the University of Oregon, 60 percent of all campus building projects in the past 10 years have been for intercollegiate athletics, including a new, palatial $200 million basketball arena financed by state bonds.
Then there is the effect on "wannabe" institutions. The UT Board of Regents on December 18 approved a plan for the University of Texas at San Antonio to start a big-time football program by 2016. The plan involves doubling student athletics fees at the 28,000-student institution to $480 annually in order to generate about 70 percent of the conservatively estimated $18 million yearly budget the football program will require. Meanwhile the Texas-San Antonio library reports that it is still occupying the same space it did when the campus opened in 1973 and has inadequate room for its print collection, computers, and student study areas.
Some university leaders have taken salary cuts in response to the fiscal crisis enveloping our universities. But they have not touched the compensation of big-time sports coaches. In November, UT's head baseball coach received a 25 percent raise, to $1 million, and an assistant football coach was designated heir apparent to the head coach's $3 million position. In January his salary will be raised 112 percent -- that's not a typo -- to $900,000, 50 percent more than UT's president makes. Remember, that's for an assistant coach.
Outdoing the Joneses for once, the University of Oregon anointed one of its assistant football coaches a head-coach-in-waiting, too, at $7 million over 5 years. Meanwhile Oregon, like many other universities, cut its academic budget this fall, resulting in fewer courses, larger class sizes and decreased student services.
Oregon and UT played in bowl games this year. What was the cost of getting there? One cost is that the football players on their teams are "bottom-feeders" in Higher Ed Watch's annual Academic BCS Rankings, based on their abysmal graduation rates and the wide gap between graduation rates for their black and white players. The second cost is in dollars that could be going to academic needs. UT's athletics budget works out to $244,684 per year for each of its 511 athlete-students, but its official student-related expenditures are $11,344 for each student. Oregon spends $108,000 per year for each athlete-student and $9,222 for each enrolled student. The stats are similar at other Division I NCAA institutions.
Other countries are beating us in education by wisely using their financial resources not for sports entertainment, but on classrooms, libraries and laboratories. American children are less well educated and have fewer career opportunities than their parents. They have less hope.
Creating more hope is what Barack Obama is all about. Let us hope he uses his influence to get Congress to close the loopholes that have perverted our higher educational priorities, and that he directs our new secretary of education to work seriously on getting university leaders across the country to focus on the one bowl game that truly matters: education.
Tom Palaima and Nathan Tublitz
Tom Palaima is the Raymond F. Dickson Centennial Professor of Classics at the University of Texas at Austin, and Nathan Tublitz is a professor of biology at the University of Oregon.
For the 11th time since World War II, boom has turned to bust in our economy. Recession brings change in both the public and private sectors, as industries and government are forced to rethink how and to whom they deliver products and services. The current recession will be no exception.
Higher education’s response to economic downturns, however, has changed little. States and their colleges and universities have used the same strategy in every recession of the past generation, doing less of the same -- reducing access, cutting programs and services -- and charging students and their families more. During each of the last three recessions, average tuition and fees at public colleges and universities have climbed nearly 25 percent, and enrollment has fallen in two of these recessions.
Choosing retrenchment over reform has helped to make college more expensive and less accessible and affordable. Since the last recession of 2001, the U.S. has fallen to tenth in the percentage of young adults with a college degree, the share of income needed for the poorest family to pay public college expenses after financial aid has jumped from 39 percent to 55 percent, and student loan borrowing has nearly doubled.
The world surrounding higher education has changed significantly since the last recession, in ways that make a repeat of past behavior riskier than before.
Eight years ago, the knowledge economy was still developing, and the Baby Boomers -- our best-educated generation -- were still in the prime of their working lives. Today, half of the fastest-growing jobs require education beyond high school, and the first of the Baby Boomers will reach retirement age in just two years. This means that millions of college-educated workers will be needed to fill new and existing jobs, and our current completion rates won’t meet that need.
Eight years ago, two-thirds of Americans believed that success in the work force didn’t require a college degree and a majority thought that qualified students could get to and through college. Today, more than half of Americans say that a college education is essential, and two-thirds say that eligible students are being shut out of college. The public’s demand for access to higher education and their confidence in colleges’ and universities’ ability to deliver it are on a collision course.
Despite these warning signs, we’re already seeing history repeat itself. Lawmakers in Florida are moving to allow every public university to increase tuition by as much as 15 percent per year despite widespread public opposition. Three of the nation’s largest public university systems -- the University of California, California State University, and Arizona State University -- are proceeding with plans to cap or cut enrollment amid rapid growth in their states’ college-going populations.
How do we break this cycle and redefine higher education’s response to financial crisis? It will require strong leadership at the state, system, and campus levels, focusing on priorities, productivity, and innovation.
Setting priorities involves hard choices. We believe that in the current financial crisis, ensuring accessible and affordable undergraduate education must be the highest priority. States should not cut higher education disproportionately compared to other state services and rely on students to make up the difference through tuition hikes. Colleges and universities should share resources to ensure that every eligible student can enroll, and redirect resources from high cost, low need graduate and research programs to undergraduate instruction. Both should make financial need the top priority for their student aid funds.
We see encouraging signs on this front. Governors in Maryland, Michigan, and Missouri have proposed shielding higher education from cuts in exchange for tuition freezes. In Pennsylvania, Gov. Ed Rendell has proposed a bold effort to increase need-based aid for students attending community and state colleges.
Gauging and increasing productivity is also a must. State, system, and campus leaders need to look at how money is being spent and the results of that spending, rather than simply focusing on revenues. They must also set clear expectations for institutions to regularly review these data and use them to reform or eliminate high cost, low performing programs and reinvest the savings in areas consistent with state needs and priorities.
There are positive developments in this area as well. The National Association of System Heads is working with public university systems in nearly 20 states to better measure and manage costs as part of a broader push to improve participation and completion rates for underrepresented students. One of the participating systems -- Mississippi Institutions of Higher Learning -- has changed its budget development process to include a focus on institutional spending, not just campus wish lists.
The third -- and perhaps most important -- element is innovation. Our colleges and universities are renowned for the innovations that they bring to other fields, but they focus relatively little on their own reinvention. Many promising initiatives, including dual high school/college enrollment and course redesign, operate on marginal dollars in good times and are the first to be cut when budgets tighten.
Here again, some states are showing leadership. Policy makers in Indiana, Ohio, Tennessee, and Texas are exploring new funding models that would include real incentives for retaining and graduating students, not just enrolling them.
Recessions are inevitable, but our responses to them are not. Policy makers and higher education leaders who once again decide to do less of the same and charge more for it will tell us that they had no other choice. But we know that just isn’t true.
Patrick M. Callan and Robert H. Atwell
Patrick M. Callan is president of the non-profit, non-partisan National Center for Public Policy and Higher Education. Robert H. Atwell is president emeritus of the American Council on Education, serves on the National Center’s board of directors, and chairs the board of directors of the Delta Project on Postsecondary Costs, Productivity, and Accountability.
Grand Theft Auto. America’s Army. Spore. The Sims. Chain Factor. Halo. Guitar Hero. City of Heroes. Left 4 Dead. Fable. World of Warcraft. Everquest. Warhammer. These are the titles of video games our students are playing when not attending or studying for our classes! On average, college students spend 50-100 hours mastering each of these games. That may make you wonder: How much time are they spending on my class?
We are entrepreneurship professors at a very entrepreneurial institution, Babson College. Recently we became interested (some of our colleagues would say obsessed) with video games, not simulations, and how they can be used in higher education. Over a whimsical e-mail exchange in late 2008 we asked each other, “If we could create a video game where students could ‘experience’ entrepreneurship, what would it look like? What would it feel like? What and how would they learn?”
Be careful what questions you pose in life: our view of the world has been dramatically altered since we embarked on an “expedition” to answer these. We can’t give away the answer just yet, but we can share part of our journey. In fact, we’re eager to share our exploration of this space to see how those of us in higher education might best embrace the reality of virtual worlds.
We must confess: we are not gamers. For the most part we are still stuck in the days of Pac-Man, Asteroids, and Centipede, but we openly admit that our cool factor is increasing because we have been caught playing Wii Tennis and Guitar Hero! But there’s something invigorating about learning something entirely new, and we don’t think we realized exactly how much we didn’t know until we played a little hooky and took a field trip to the industry mecca: GDC. For the uninitiated, this is the annual Game Developers Conference. The week-long conference started with two days of “Summits” devoted to different areas of gaming such as artificial intelligence, mobile gaming, casual games, and virtual worlds. We attended the Serious Games Summit, which “spotlights the rapidly growing serious games industry that features the use of interactive games technology within non-entertainment sectors. The summit provides a forum for game developers and industry professionals to examine the future course of serious games development in areas such as education, government, health, military, science, corporate training, first responders, and social change.”
We learned about the human-interest side of the gaming industry: for instance, a sign of experience, and therefore status, is wearing not only jeans and T-shirts but GDC shirts from previous conferences. As business school professors, well, let’s just say we didn’t bring any T-shirts, or at least any we would wear in public. As any good conformists would do, however, we bought GDC shirts on the first day, and trust us when we say the crowds in the GDC store were on par with those in an Apple store during the holiday season. Never have we been at a conference where “while supplies last” really means something.
Gaming is serious business both economically and socially. Consumers spend $25 billion a year on video games and game components, and there are an estimated 800 million gamers worldwide. But the social upside of gaming is either misunderstood or, at the least, not yet widely understood. Games such as Grand Theft Auto and Postal have unfairly defined the industry as one that promotes aggressive and violent behavior.
But for the sake of argument we must consider the corollary: if games can promote negative behavior, can they not also promote positive behavior? The opening speaker of the Serious Games Summit, Austin Hill of Akoha, asked a compelling and poignant question: “What if playing a game could make the world a better place?” And we quickly learned that some games are making the world a better place. Games that aim to have a positive social impact are among the fastest growing of all serious games segments. These games are unleashing the imagination of our youth – an imagination that should be cultivated to navigate the complexity and uncertainty of the “real world.”
We learned that lines between the real world and virtual worlds are blurring. During a case study presentation on an emerging virtual world game for young children, the designers spoke about the challenge of very young gamers not seeing the distinction between the physical and virtual worlds. The purpose of the game was to have children design a virtual toy that they would then go buy in physical form. The language of the game encouraged children to make their toy “real.” The children did not understand the terms “make it real” because the virtual toy in their mind’s eye was already real. Whether virtual or real, it was all about play.
Gaming, serious and casual alike, can promote a culture of empathy. During one of the very first sessions the speaker presented a selection of quotes from young gamers. One young gamer said that gaming made him emotional. He felt hardened by reality, but games allowed him to release emotions that would have otherwise remained dormant. Rather than desensitizing our youth, games are allowing students to explore what Will Wright, creator of the Sims franchise and Spore, called the “possibility space.” Every game has a beginning and an end, but today’s advanced games allow each player to create a unique path while seeing, experiencing, and perhaps even feeling the consequences of their decisions.
The necessity of collaboration was ubiquitous. Even the GDC bookstore inspired us to think about education and gaming in a different way. The number of books on display that crossed disciplines, modes of learning, future levels of intelligence, and task-oriented programming was quite striking. We saw books on creativity, management and leadership, developing a team, and getting your product to market. Ironically, this is what we see at our business school conferences. The world is getting smaller.
Taking center stage in the store were books on art, mythology, writing and storytelling, sociology, and anthropology. The world is getting more integrated. While many of the speakers throughout the Serious Games Summit talked about the importance of teams with each team member having an important skill set, they also talked about the need to have team members understand the perspective of others. It wasn’t enough to be the pure programmer or be the pure content expert. You needed to have an understanding of what the other was going to do to have a truly excellent product. We started thinking further about our academic tradition of silos and what this really means for the future of higher education. The world, virtual and real, does not exist in silos.
Overall, the future of cyberspace is analogous to the future of business – new worlds, new actors, new ways of navigating, new outcomes, new pathways, and broader, more integrated, ways of thinking. What will our avatar look like? And will it be buying a new corporate jet with federal government stimulus money?
In general, our classrooms are filled with discussions related to the economy and global business challenges. It’s not only a good time to review our business models but to rethink the actual role of business in society and how we teach. We teach business from traditional models developed, for the most part, many, many decades ago. Is this really the best we can do? Are games possibly teaching the things we don’t, won’t or can’t?
At the beginning of the Serious Games Summit we decided to use a video game design approach to help us think in a more “gaming way” about what we were learning and its application to entrepreneurship within higher education. To do so, we bought a box of cards called The Art of Game Design: A Deck of Lenses, by Jesse Schell. The box (with accompanying book) claims to be “The Ultimate Creativity Toolkit for Game Design.” Our approach was simple: randomly pick a card from the deck at the beginning of every session, write it down, and see if it speaks to us in some way, in the moment or later. The cards we chose, 15 in total, created an uncanny story of our experience at the GDC. We offer a glimpse of three of the cards chosen over the course of two days.
The first card chosen from our brand-new deck of game design cards was The Lens of Secret Purpose, and it asked, “Why am I doing this?” Yes, we laughed, but our purpose was simple: we are curious; we are insatiable learners; and we passionately believe that we need to find better ways of teaching and learning.
Another card was The Lens of Endogenous Value that asked us to consider the “relationship between value in the game and the player’s motivations.” We extended this to think about the motivation of our current generation of students and the connection or disconnection to our pedagogy. Higher education must be more than workforce development, even in times of economic crisis. Perhaps especially in times of economic crisis.
Yet another card chosen was called The Lens of the Crystal Ball, which happened to be the last card we chose at the conference. The card stated, “If you would know the future of a particular game technology, ask yourself these questions. What will ____ be like two years from now? What will ____ be like four years from now? What will ____ be like ten years from now? Why?” Think about it. Higher education is a game. We have a start, a finish, and many possibility spaces – the pathways our students choose to navigate their college experience. The difference between video games and higher education as a game is the pace of change. A game introduced today will look considerably different in four years. Can we say the same about curriculum?
The world of game design is about play, experiencing and creating empathy, collaboration, and future thinking. It emphasizes purpose and value, and recognizes the constant need to adapt and embrace new technology. Imagine the world today if we replaced the words “game design” in the first sentence of this paragraph with the words “higher education.” We definitely think the time has come to embrace the reality of virtual worlds!
Patricia G. Greene and Heidi M. Neck
Patricia G. Greene and Heidi M. Neck are professors of entrepreneurship at Babson College.
At first glance, Peter Drucker might seem an unlikely candidate to have published an academic novel. Famous for writing books such as Concept of the Corporation and The Effective Executive, Drucker was dubbed “The Man Who Invented Management” in his 2005 Business Week obituary. Drucker’s audience was to be found among the Harvard Business Review crowd, not the Modern Language Association coterie, and, not surprisingly, his two novels are no longer in print.
But the university he presented in his 1984 novel, The Temptation to Do Good, confronted some key questions that face higher education institutions in today’s unprecedented financial downturn: Are current practices sustainable? Have we strayed from our core mission? Will the liberal arts survive increasing budget pressures?
As these questions -- hardly the usual literary fare -- demonstrate, Drucker’s work is a rarity among academic novels. These texts typically provide a send-up of academic life, by making fun of intellectual trends through characters such as Jack Gladney, who chairs the department of Hitler studies in Don DeLillo’s White Noise, or by parodying the pettiness of department politics, as in Richard Russo’s Straight Man, in which one English professor’s nose is mangled during a personnel committee meeting, courtesy of a spiral notebook thrown at him by one of his peers. By contrast, The Temptation to Do Good is almost painstakingly earnest in its portrayal of Father Heinz Zimmerman, president of the fictional St. Jerome University.
Like other contemporary academic novels, The Temptation to Do Good depicts the problems of political correctness, the tensions between faculty and administration, and the scandal of inter-office romance. But St. Jerome’s problems are no laughing matter. Lacking the improbable events of other academic novels -- in James Hynes’s The Lecturer’s Tale, the adjunct-protagonist even gains super-human powers -- the plot of The Temptation to Do Good is completely plausible, and the problems above destroy a good man.
St. Jerome’s chemistry department decides not to hire Martin Holloway, a job candidate with a less-than-stellar research record. Feeling sorry for the soon-to-be-unemployed Ph.D., Zimmerman decides to recommend Holloway to the dean of a nearby small college. Zimmerman knows he shouldn’t interfere, but he feels he must do the Christian thing, and so, succumbing to “the temptation to do good,” he makes the call. Meanwhile, Holloway’s angry wife spreads unfounded rumors about a dalliance between the president-priest and his female assistant. The faculty overreact to both events, and although most of them come to regret it, Zimmerman’s presidency is brought down, and he is eased out by the church into a sinecure government position.
Often reading like an intricate case study of one university’s internal politics, The Temptation to Do Good aims to do more than that, raising questions about the purpose of higher education institutions writ large. Representing the contemporary university as a large, bureaucratic institution -- much like the companies that Drucker’s theories would shape -- The Temptation to Do Good portrays Zimmerman as a successful executive, one who “converted a cow college in the sticks” into a national university with a reputation unrelated to its religious roots. He even makes the cover of Time magazine for increasing his endowment by a larger percentage than any other university over the past five years.
Although some faculty recognize, as one physics professor admits, that they wouldn’t be able to do their research without the money he has brought in, many of them are also disenchanted with Father Zimmerman, CEO. The chemistry chair chose to come to St. Jerome because he expected it to be “less corrupted by commercialism and less compromised by the embrace of industry” than other institutions, which he realizes isn’t the case.
“We have a right,” says the chair of modern languages, upset over the abolition of the language requirement, “to expect the President of a Catholic university to stand up for a true liberal education.” In both cases, we see the ideals of a Catholic university being linked to the ideals of a liberal arts education, both focused on a pure devotion to the pursuit of knowledge seen as incompatible with Zimmerman’s expanded professional schools and intimate sense of students’ consumer needs. Can St. Jerome be true to both the liberal arts and the practical, professionalized realm at the same time?
This question is never resolved in the novel, but outside of his fiction writing, Drucker was deeply interested in the practicality of the liberal arts. In his autobiography, he discusses his deep appreciation of Bennington College, a school designed to combine progressive methods -- connecting learning to practical experience -- with the ideas of Robert Hutchins, the University of Chicago president and famed proponent of classical liberal ideals. William Whyte’s sociological classic Organization Man cites Drucker as saying that “the most vocational course a future businessman can take is one in the writing of poetry or short stories.”
Although Drucker was unusual in actually writing novels himself, he was not alone among business thinkers in extolling the values of the liberal arts. Tom Peters and Robert Waterman’s In Search of Excellence: Lessons from America’s Best-Run Companies describes an investment banker who suggests closing business schools and providing students with a “liberal arts literacy” that includes “a broader vision, a sense of history, perspectives from literature and art.”
More recently, Thomas Friedman’s The World is Flat includes a section focusing on the importance of a liberal arts education in the new integrated, global economy. “Encouraging young people early to think horizontally and to connect disparate dots has to be a priority,” writes Friedman, “because this is where and how so much innovation happens. And first you need dots to connect. And to me that means a liberal arts education.”
Books like Rolf Jensen’s The Dream Society: How the Coming Shift from Information to Imagination Will Transform Your Business, Joseph Pine II and James H. Gilmore’s The Experience Economy: Work Is Theatre and Every Business a Stage, Daniel H. Pink’s A Whole New Mind: Why Right-Brainers Will Rule the Future, and Richard Lanham’s The Economics of Attention: Style and Substance in the Information Age make these points more specifically, often showing how certain “literary” skills, such as storytelling and empathy, are crucial to success in the current economy.
Of the authors mentioned above, only Lanham is a humanities professor, and he works in a field (rhetoric) largely out of scholarly vogue today. “Let’s go back to the subject of English a moment. Of all subjects none is potentially more useful,” Whyte writes. “That English is being slighted by business and students alike does not speak well of business. But neither does it speak well for English departments.”
What’s significant about Whyte’s account -- along with those of Drucker, Friedman, and others -- is that none of them claim that colleges and universities should merely churn out students of technical writing or focus on the practicality of the composition course; instead, they want students to grapple with narrative complexity and storytelling through the liberal arts. Whyte himself focuses on the study of Shakespeare and Charles Lamb.
However, instead of embracing these potential real-world allies, liberal arts disciplines have seemed to withdraw, letting others become the experts in -- and proponents of -- the relevance of their subjects. Consider, for example, that in January 2008, one of the most famous English professors in the world proclaimed on his New York Times blog that the study of literature is useless. Asserting that the humanities don’t do anything but give us pleasure, Stanley Fish wrote that, “To the question of ‘what use are the humanities?’ the only honest answer is none whatsoever.” The arts and humanities, Fish contended, won’t get you a job, make you a well-rounded citizen, or ennoble you in any way.
Not surprisingly, readers were appalled. Within the next 48 hours, 484 comments were posted online, most of them critical of Fish. The majority of these comments, from a mix of scientists, humanists, business people, and artists, could be divided into two categories: first, the humanities are useful because they provide critical thinking skills that are useful for doing your job, whether you’re a doctor or CEO; and second, the humanities are useful for more than just your job, whether that means being a more informed citizen or simply a more interesting conversationalist.
However, perhaps the most fascinating comments came from those who recognized Fish’s stance as a professional one: in other words, one that relates to attitudes toward the humanities held by practitioners inside the academy (professors), as distinct from those held by general educated readers outside it (the Times audience). “Let’s not conflate some academics -- those who have professionalized their relationship with the humanities to the point of careerist cynicism -- with those [...] still capable of a genuine relationship to the humanities,” said one reader. Another added that the “humanities have been taken over by careerists, who speak and write only for each other.”
In other words, while readers defend the liberal arts’ relevance, scholars, who are busy writing specialized scholarship for one another, simply aren’t making the case. This was an interesting debate when Fish wrote his column over a year ago; now in 2009, we should consider it an urgent one.
Traditionally, economic downturns are accompanied by declines in the liberal arts, and with today’s unparalleled budget pressures, higher education institutions will need to scrutinize the purpose of everything they do as never before. Drucker’s academic novel provides an illustrative example of the liberal arts at work: as Fish’s readers would point out, literature can raise theoretical questions that help us understand very practical issues.
To be sure, the liberal arts are at least partly valuable because they are removed from practical utility as conceived in business; the return on investment from a novel can’t be directly tied to whether it improves the reader’s bottom line.
But justifiable concerns among scholars that the liberal arts will become only about utility have driven the academy too far in the opposite direction. Within higher education, we acknowledge that the writing skills gained in an English seminar might help alumni craft corporate memos, but it is outside higher education where the liveliest conversations about the liberal arts’ richer benefits to the practical world -- empathic skills and narrative analysis, for example -- seem to occur.
Drucker and his fellow business thinkers may be raising the right questions, but these discussions should be equally led by those professionally trained in the disciplines at hand. In today’s economic climate, it may become more important than ever for the liberal arts to mount a strong defense -- let’s not leave it entirely in the hands of others.
Melanie Ho is a higher education consultant in Washington. She has taught literature, writing and leadership development courses at the University of California at Los Angeles.