For the 11th time since World War II, boom has turned to bust in our economy. Recession brings change in both the public and private sectors, as industries and government are forced to rethink how and to whom they deliver products and services. The current recession will be no exception.
Higher education’s response to economic downturns, however, has changed little. States and their colleges and universities have used the same strategy in every recession of the past generation, doing less of the same -- reducing access, cutting programs and services -- and charging students and their families more. During each of the last three recessions, average tuition and fees at public colleges and universities have climbed nearly 25 percent, and enrollment has fallen in two of these recessions.
Choosing retrenchment over reform has helped to make college more expensive and less accessible and affordable. Since the last recession of 2001, the U.S. has fallen to tenth in the percentage of young adults with a college degree, the share of income needed for the poorest family to pay public college expenses after financial aid has jumped from 39 percent to 55 percent, and student loan borrowing has nearly doubled.
The world surrounding higher education has changed significantly since the last recession, in ways that make a repeat of past behavior riskier than before.
Eight years ago, the knowledge economy was still developing, and the Baby Boomers -- our best-educated generation -- were still in the prime of their working lives. Today, half of the fastest-growing jobs require education beyond high school, and the first of the Baby Boomers will reach retirement age in just two years. This means that millions of college-educated workers will be needed to fill new and existing jobs, and our current completion rates won’t meet that need.
Eight years ago, two-thirds of Americans believed that success in the work force didn’t require a college degree and a majority thought that qualified students could get to and through college. Today, more than half of Americans say that a college education is essential, and two-thirds say that eligible students are being shut out of college. The public’s demand for access to higher education and their confidence in colleges’ and universities’ ability to deliver it are on a collision course.
Despite these warning signs, we’re already seeing history repeat itself. Lawmakers in Florida are moving to allow every public university to increase tuition by as much as 15 percent per year despite widespread public opposition. Three of the nation’s largest public university systems -- the University of California, California State University, and Arizona State University -- are proceeding with plans to cap or cut enrollment amid rapid growth in their states’ college-going populations.
How do we break this cycle and redefine higher education’s response to financial crisis? It will require strong leadership at the state, system, and campus levels, focusing on priorities, productivity, and innovation.
Setting priorities involves hard choices. We believe that in the current financial crisis, ensuring accessible and affordable undergraduate education must be the highest priority. States should not cut higher education disproportionately compared to other state services and then rely on students to make up the difference through tuition hikes. Colleges and universities should share resources to ensure that every eligible student can enroll, and redirect resources from high-cost, low-need graduate and research programs to undergraduate instruction. Both should make financial need the top priority for their student aid funds.
We see encouraging signs on this front. Governors in Maryland, Michigan, and Missouri have proposed shielding higher education from cuts in exchange for tuition freezes. In Pennsylvania, Gov. Ed Rendell has proposed a bold effort to increase need-based aid for students attending community and state colleges.
Gauging and increasing productivity is also a must. State, system, and campus leaders need to look at how money is being spent and the results of that spending, rather than simply focusing on revenues. They must also set clear expectations for institutions to regularly review these data and use them to reform or eliminate high-cost, low-performing programs and reinvest the savings in areas consistent with state needs and priorities.
There are positive developments in this area as well. The National Association of System Heads is working with public university systems in nearly 20 states to better measure and manage costs as part of a broader push to improve participation and completion rates for underrepresented students. One of the participating systems -- Mississippi Institutions of Higher Learning -- has changed its budget development process to include a focus on institutional spending, not just campus wish lists.
The third -- and perhaps most important -- element is innovation. Our colleges and universities are renowned for the innovations that they bring to other fields, but they focus relatively little on their own reinvention. Many promising initiatives, including dual high school/college enrollment and course redesign, operate on marginal dollars in good times and are the first to be cut when budgets tighten.
Here again, some states are showing leadership. Policy makers in Indiana, Ohio, Tennessee, and Texas are exploring new funding models that would include real incentives for retaining and graduating students, not just enrolling them.
Recessions are inevitable, but our responses to them are not. Policy makers and higher education leaders who once again decide to do less of the same and charge more for it will tell us that they had no other choice. But we know that just isn’t true.
Patrick M. Callan and Robert H. Atwell
Patrick M. Callan is president of the non-profit, non-partisan National Center for Public Policy and Higher Education. Robert H. Atwell is president emeritus of the American Council on Education, serves on the National Center’s board of directors, and chairs the board of directors of the Delta Project on Postsecondary Costs, Productivity, and Accountability.
Grand Theft Auto. America’s Army. Spore. The Sims. Chain Factor. Halo. Guitar Hero. City of Heroes. Left 4 Dead. Fable. World of Warcraft. Everquest. Warhammer. These are the titles of video games our students are playing when not attending or studying for our classes! On average, college students spend 50-100 hours mastering each of these games. This may make you wonder: How much time are they spending on my class?
We are entrepreneurship professors at a very entrepreneurial institution, Babson College. Recently we became interested in (some of our colleagues would say obsessed with) video games -- not simulations -- and how they can be used in higher education. In a whimsical e-mail exchange in late 2008 we asked each other, “If we could create a video game where students could ‘experience’ entrepreneurship, what would it look like? What would it feel like? What and how would they learn?”
Be careful what questions you pose in life: after embarking on an “expedition” to answer ours, we found our view of the world dramatically altered. We can’t give away the answer just yet, but we can share part of our journey. In fact, we’re eager to share our exploration of this space to see how those of us in higher education might best embrace the reality of virtual worlds.
We must confess: we are not gamers. For the most part we are still stuck in the days of Pac-Man, Asteroids, and Centipede, but we openly admit that our cool factor is increasing because we have been caught playing Wii Tennis and Guitar Hero! There’s something invigorating about learning something entirely new, and we don’t think we realized exactly how much we didn’t know until we played a little hooky and took a field trip to the industry mecca -- GDC. For the uninitiated, this is the annual Game Developers Conference. The week-long conference started with two days of “Summits” devoted to different areas of gaming such as artificial intelligence, mobile gaming, casual games, and virtual worlds. We attended the Serious Games Summit, which “spotlights the rapidly growing serious games industry that features the use of interactive games technology within non-entertainment sectors. The summit provides a forum for game developers and industry professionals to examine the future course of serious games development in areas such as education, government, health, military, science, corporate training, first responders, and social change.”
We learned about the human-interest side of the gaming industry -- for instance, that a sign of experience, and therefore status, is wearing not only jeans and T-shirts but GDC shirts from previous conferences. As business school professors, well, let’s just say we didn’t bring any T-shirts, or at least any we would wear in public. As any good conformists would do, however, we bought GDC shirts on the first day, and trust us when we say the crowds in the GDC store were on par with those in an Apple store during the holiday season. Never have we been at a conference where “while supplies last” really means something.
Gaming is serious business, both economically and socially. Consumers spend $25 billion a year on video games and game components, and there are an estimated 800 million gamers worldwide. But the social upside of gaming is either misunderstood or, at the least, not yet broadly understood. Games such as Grand Theft Auto and Postal have inappropriately defined the industry as one that promotes aggressive and violent behavior.
But for the sake of argument we must consider the converse: if games can promote negative behavior, can they not also promote positive behavior? The opening speaker of the Serious Games Summit, Austin Hill of Akoha, asked a compelling and poignant question: “What if playing a game could make the world a better place?” And we quickly learned that some games are making the world a better place. Games that aim to have a positive social impact are among the fastest growing of all serious games segments. These games are unleashing the imagination of our youth -- an imagination that should be cultivated to navigate the complexity and uncertainty of the “real world.”
We learned that lines between the real world and virtual worlds are blurring. During a case study presentation on an emerging virtual world game for young children, the designers spoke about the challenge of very young gamers not seeing the distinction between the physical and virtual worlds. The purpose of the game was to have children design a virtual toy that they would then go buy in physical form. The language of the game encouraged children to make their toy “real.” The children did not understand the terms “make it real” because the virtual toy in their mind’s eye was already real. Whether virtual or real, it was all about play.
Gaming, serious and casual alike, can promote a culture of empathy. During one of the very first sessions the speaker presented a selection of quotes from young gamers. One young gamer said that gaming made him emotional. He felt hardened by reality, but games allowed him to release emotions that would have otherwise remained dormant. Rather than desensitizing our youth, games are allowing students to explore what Will Wright, creator of The Sims franchise and Spore, called the “possibility space.” Every game has a beginning and an end, but today’s advanced games allow each player to create a unique path while seeing, experiencing, and perhaps even feeling the consequences of their decisions.
The necessity of collaboration was ubiquitous. Even the GDC bookstore inspired us to think about education and gaming in a different way. The number of books on display that crossed disciplines, modes of learning, future levels of intelligence, and task-oriented programming was quite striking. We saw books on creativity, management and leadership, developing a team, and getting your product to market. Ironically, this is what we see at our business school conferences. The world is getting smaller.
Taking center stage in the store were books on art, mythology, writing and storytelling, sociology, and anthropology. The world is getting more integrated. While many of the speakers throughout the Serious Games Summit talked about the importance of teams in which each member brings a distinct skill set, they also stressed the need for team members to understand the perspectives of others. It wasn’t enough to be the pure programmer or the pure content expert; you needed to understand what the others were doing to produce a truly excellent product. We started thinking further about our academic tradition of silos and what it really means for the future of higher education. The world, virtual and real, does not exist in silos.
Overall, the future of cyberspace is analogous to the future of business – new worlds, new actors, new ways of navigating, new outcomes, new pathways, and broader, more integrated, ways of thinking. What will our avatar look like? And will it be buying a new corporate jet with federal government stimulus money?
In general, our classrooms are filled with discussions related to the economy and global business challenges. It’s a good time not only to review our business models but also to rethink the actual role of business in society and how we teach. We teach business from traditional models developed, for the most part, many, many decades ago. Is this really the best we can do? Are games possibly teaching the things we don’t, won’t, or can’t?
At the beginning of the Serious Games Summit we decided to use a video game design approach to help us think in a more “gaming way” about what we were learning and its application to entrepreneurship within higher education. To do so, we bought a box of cards called The Art of Game Design: A Deck of Lenses, by Jesse Schell. The box (with accompanying book) claims to be “The Ultimate Creativity Toolkit for Game Design.” Our approach was simple: randomly pick a card from the deck at the beginning of every session, write it down, and see if it spoke to us in some way, either in the moment or later. The cards we chose, 15 in total, created an uncanny story of our experience at the GDC. We offer a glimpse of three of the cards chosen over the course of two days.
The first card chosen from our brand new deck of game design cards was The Lens of Secret Purpose and it asked, “Why am I doing this?” Yes, we laughed but our purpose was simple. We are curious; we are insatiable learners; and we passionately believe that we need to find better ways of teaching and learning.
Another card was The Lens of Endogenous Value that asked us to consider the “relationship between value in the game and the player’s motivations.” We extended this to think about the motivation of our current generation of students and the connection or disconnection to our pedagogy. Higher education must be more than workforce development, even in times of economic crisis. Perhaps especially in times of economic crisis.
Yet another card chosen was called The Lens of the Crystal Ball, which happened to be the last card we chose at the conference. The card stated, “If you would know the future of a particular game technology, ask yourself these questions. What will ____ be like two years from now? What will ____ be like four years from now? What will ____ be like ten years from now? Why?” Think about it. Higher education is a game. We have a start, a finish, and many possibility spaces -- the pathways our students choose to navigate their college experience. The difference between video games and higher education as a game is the pace of change. A game introduced today will look considerably different in four years. Can we say the same about curriculum?
The world of game design is about play, experiencing and creating empathy, collaboration, and future thinking. It emphasizes purpose and value, and recognizes the constant need to adapt and embrace new technology. Imagine the world today if we replaced the words “game design” in the first sentence of this paragraph with the words “higher education.” We definitely think the time has come to embrace the reality of virtual worlds!
Patricia G. Greene and Heidi M. Neck
Patricia G. Greene and Heidi M. Neck are professors of entrepreneurship at Babson College.
At first glance, Peter Drucker might seem an unlikely candidate to have published an academic novel. Famous for writing books such as Concept of the Corporation and The Effective Executive, Drucker was dubbed “The Man Who Invented Management” in his 2005 Business Week obituary. Drucker’s audience was to be found among the Harvard Business Review crowd, not the Modern Language Association coterie, and, not surprisingly, his two novels are no longer in print.
But the university he presented in his 1984 novel, The Temptation to Do Good, confronted some key questions that face higher education institutions in today’s unprecedented financial downturn: Are current practices sustainable? Have we strayed from our core mission? Will the liberal arts survive increasing budget pressures?
As these questions -- hardly the usual literary fare -- demonstrate, Drucker’s work is a rarity among academic novels. These texts typically provide a send-up of academic life, by making fun of intellectual trends through characters such as Jack Gladney, who chairs the department of Hitler studies in Don DeLillo’s White Noise, or by parodying the pettiness of department politics, as in Richard Russo’s Straight Man, in which one English professor’s nose is mangled during a personnel committee meeting, courtesy of a spiral notebook thrown at him by one of his peers. By contrast, The Temptation to Do Good is almost painstakingly earnest in its portrayal of Father Heinz Zimmerman, president of the fictional St. Jerome University.
Like other contemporary academic novels, The Temptation to Do Good depicts the problems of political correctness, the tensions between faculty and administration, and the scandal of inter-office romance. But St. Jerome’s problems are no laughing matter. Lacking the improbable events of other academic novels -- in James Hynes’s The Lecturer’s Tale, the adjunct-protagonist even gains super-human powers -- the plot of The Temptation to Do Good is completely plausible, and the problems above destroy a good man.
St. Jerome’s chemistry department decides not to hire Martin Holloway, a job candidate with a less-than-stellar research record. Feeling sorry for the soon-to-be-unemployed Ph.D., Zimmerman decides to recommend Holloway to the dean of a nearby small college. Zimmerman knows he shouldn’t interfere, but he feels he must do the Christian thing, and so, succumbing to “the temptation to do good,” he makes the call. Meanwhile, Holloway’s angry wife spreads unfounded rumors about a dalliance between the president-priest and his female assistant. The faculty overreact to both events, and although most of them come to regret it, Zimmerman’s presidency is brought down, and he is eased out by the church into a sinecure government position.
Often reading like an intricate case study of one university’s internal politics, The Temptation to Do Good aims to do more than that, raising questions about the purpose of higher education institutions writ large. Representing the contemporary university as a large, bureaucratic institution -- much like the companies that Drucker’s theories would shape -- The Temptation to Do Good portrays Zimmerman as a successful executive, one who “converted a cow college in the sticks” into a national university with a reputation unrelated to its religious roots. He even makes the cover of Time magazine for increasing his endowment by a larger percentage than any other university over the past five years.
Although some faculty recognize, as one physics professor admits, that they wouldn’t be able to do their research without the money he has brought in, many of them are also disenchanted with Father Zimmerman, CEO. The chemistry chair chose to come to St. Jerome because he expected it to be “less corrupted by commercialism and less compromised by the embrace of industry” than other institutions, which he realizes isn’t the case.
“We have a right,” says the chair of modern languages, upset over the abolition of the language requirement, “to expect the President of a Catholic university to stand up for a true liberal education.” In both cases, we see the ideals of a Catholic university being linked to the ideals of a liberal arts education, both focused on a pure devotion to the pursuit of knowledge seen as incompatible with Zimmerman’s expanded professional schools and intimate sense of students’ consumer needs. Can St. Jerome be true to both the liberal arts and the practical, professionalized realm at the same time?
This question is never resolved in the novel, but outside of his fiction writing, Drucker was deeply interested in the practicality of the liberal arts. In his autobiography, he discusses his deep appreciation of Bennington College, a school designed to combine progressive methods -- connecting learning to practical experience -- with the ideas of Robert Hutchins, the University of Chicago president and famed proponent of classical liberal ideals. William Whyte’s sociological classic The Organization Man cites Drucker as saying that “the most vocational course a future businessman can take is one in the writing of poetry or short stories.”
Although Drucker was unusual in actually writing novels himself, he was not alone among business thinkers in expressing the values of the liberal arts. Tom Peters and Robert Waterman’s In Search of Excellence: Lessons from America’s Best-Run Companies describes an investment banker who suggests closing business schools and providing students with a “liberal arts literacy,” that includes “a broader vision, a sense of history, perspectives from literature and art.”
More recently, Thomas Friedman’s The World Is Flat includes a section focusing on the importance of a liberal arts education in the new integrated, global economy. “Encouraging young people early to think horizontally and to connect disparate dots has to be a priority,” writes Friedman, “because this is where and how so much innovation happens. And first you need dots to connect. And to me that means a liberal arts education.”
Books like Rolf Jensen’s The Dream Society: How the Coming Shift from Information to Imagination Will Transform Your Business, Joseph Pine II and James H. Gilmore’s The Experience Economy: Work Is Theatre & Every Business a Stage, Daniel H. Pink’s A Whole New Mind: Why Right-Brainers Will Rule the Future, and Richard Lanham’s The Economics of Attention: Style and Substance in the Information Age make these points more specifically, often showing how certain “literary” skills, such as storytelling and empathy, are crucial to success today.
Of the authors mentioned above, only Lanham is a humanities professor, and he works in a field (rhetoric) largely out of scholarly vogue today. “Let’s go back to the subject of English a moment. Of all subjects none is potentially more useful,” Whyte writes. “That English is being slighted by business and students alike does not speak well of business. But neither does it speak well for English departments.”
What’s significant about Whyte’s account -- along with that of Drucker, Friedman, and others -- is that none of them claim that colleges and universities should merely churn out students of technical writing or focus on the practicality of the composition course; instead they want students to think about narrative complexity and story-telling through the liberal arts. Whyte himself focuses on the study of Shakespeare and Charles Lamb.
However, instead of embracing these potential real-world allies, liberal arts disciplines have seemed to withdraw, letting others become the experts in -- and proponents of -- the relevance of their subjects. Consider, for example, that in January 2008, one of the most famous English professors in the world proclaimed on his New York Times blog that the study of literature is useless. Asserting that the humanities don’t do anything but give us pleasure, Stanley Fish wrote, “To the question of ‘what use are the humanities?’ the only honest answer is none whatsoever.” The arts and humanities, Fish contended, won’t get you a job, make you a well-rounded citizen, or ennoble you in any way.
Not surprisingly, readers were appalled. Within the next 48 hours, 484 comments were posted online, most of them critical of Fish. The majority of these comments, from a mix of scientists, humanists, business people, and artists, could be divided into two categories: first, the humanities are useful because they provide critical thinking skills that are useful for doing your job, whether you’re a doctor or CEO; and second, the humanities are useful for more than just your job, whether that means being a more informed citizen or simply a more interesting conversationalist.
However, perhaps the most fascinating comments came from those who recognized Fish’s stance as a professional one: in other words, one that relates to attitudes toward the humanities held by practitioners inside the academy (professors), as distinct from those held by general educated readers outside it (the Times audience). “Let’s not conflate some academics -- those who have professionalized their relationship with the humanities to the point of careerist cynicism -- with those [...] still capable of a genuine relationship to the humanities,” said one reader. Another added that the “humanities have been taken over by careerists, who speak and write only for each other.”
In other words, while readers defend the liberal arts’ relevance, scholars, who are busy writing specialized scholarship for one another, simply aren’t making the case. This was an interesting debate when Fish wrote his column over a year ago; now in 2009, we should consider it an urgent one.
Traditionally, economic downturns are accompanied by declines in the liberal arts, and with today’s unparalleled budget pressures, higher education institutions will need to scrutinize the purpose of everything they do as never before. Drucker’s academic novel provides an illustrative example of the liberal arts at work: as Fish’s readers would point out, literature can raise theoretical questions that help us understand very practical issues.
To be sure, the liberal arts are at least partly valuable because they are removed from practical utility as conceived in business; the return on investment from a novel can’t be directly tied to whether it improves the reader’s bottom line.
But justifiable concerns among scholars that the liberal arts will become only about utility have driven the academy too far in the opposite direction. Within higher education, we acknowledge that the writing skills gained in an English seminar might help alumni craft corporate memos, but it is outside higher education where the liveliest conversations about the liberal arts’ richer benefits -- empathic skills and narrative analysis, for example -- to the practical world seem to occur.
Drucker and like-minded business thinkers may be raising the right questions, but these discussions should be equally led by those professionally trained in the disciplines at hand. In today’s economic climate, it may become more important than ever for the liberal arts to mount a strong defense -- let’s not leave it entirely in the hands of others.
Melanie Ho is a higher education consultant in Washington. She has taught literature, writing and leadership development courses at the University of California at Los Angeles.
Once you've done real estate, casinos, an airline, and reality television, what's left? For Donald Trump, there's always higher education.
On Monday Trump unveiled his own "university," which will sell CD-ROMs and offer online courses in real estate and business. No credit or degrees will be offered, although baseball caps and shirts with the university logo may be purchased ($21.95 for a cap, $39.95 for a golf shirt).