Curriculum

Colleges need to be much more innovative with their curricula (essay)

The Curriculum

As students return to campus this fall, they (along with their families) are facing another round of pricey tuition bills. Last year, tuition, fees, room and board averaged about $20,000 for public four-year universities, according to the College Board. That’s more than double the total cost 25 years ago, even when adjusted for inflation. To add fuel to the fire, in a recent poll, nearly half of undergraduates said they learned “nothing” after their first two years of college.

Interestingly, that’s the period during which many students are required to take courses that don’t align with their interests or career goals and instead are part of the “core curriculum” that is required of all students to graduate.

Both of us are university educators; dismaying to us (and perhaps surprising to the reader) is that such requirements are often arbitrary, and there’s little or no data to support their selection. For organizations grounded in research and experimentation, universities engage in surprisingly little analysis about the educational value of courses they offer, beyond those perfunctory student course evaluations.

The goal of universities should be to develop students into mature adults who are knowledgeable, able to function in complex society and prepared for the next phase of their studies or career.

College is a small amount of time that costs a great deal of money, and neither the time nor the money should be wasted by requiring students to sit in large lecture halls on campus, taking introductory-level courses from an arbitrarily chosen bucket. We need to reconsider that approach.

First, a group of successful people from across the country, from all walks of life, led by evidence-based educators, could be convened to develop a list of core course requirements that all universities would use. They would carefully determine what sort of basic knowledge -- such as algebra, foreign languages, basic science, and so forth -- is really necessary for anybody to be considered “educated.” But they shouldn’t stop there. After making their choices, universities should do what they do best -- experimentation -- to evaluate how different core courses affect outcomes for students. Deciding on appropriate outcomes, and when to measure them, should be part of the process.

Second, whatever is decided, universities should shift these core courses to online instruction. Students’ on-campus time is better spent on other endeavors, and it’s inefficient for every university in the country to design and teach the same core courses. Basic Chemistry is the same, whether it’s taught by a professor in Alaska or Arkansas. 

Instead, universities should create a marketplace of online courses to provide students with the best instruction available, even if it’s not produced locally. Those courses could even be taken before students start college, similar to AP courses hundreds of thousands of high school students take every year.

With those core courses out of the way, universities could direct their resources towards more focused curricula where students don’t just learn basic facts but instead learn to think and function as mature adults.

Small group courses would focus on developing skills like oral and written communication, interaction with peers, team behavior, leadership and -- just as important -- followership. Importantly, the instructors who lead these courses need to be teachers who excel in this type of environment. In many cases, the most impressive professors -- those who’ve been published frequently or have conducted groundbreaking research -- won’t thrive in it. Universities should embrace the value of true teachers for this purpose.

A few of these courses would be required, but they, too, would be subject to experimentation and to demonstrating a contribution to the student’s maturity. On-campus courses might include debate, art and architecture, and great books. Students could take an “innovation” course, in which they’d interview professionals in a field of their interest to understand the challenges they face in the real world, then discuss possible solutions with classmates. Students would also be required to keep a personal digital portfolio of their accomplishments throughout their college years, which would form the basis for meetings with their adviser and help with self-reflection.

Students, of course, would take elective courses -- also in small groups -- which would help them prepare for a final capstone course. There, students would work in teams and in coordination with a professor to address a real-world problem, whether building a model of a device or planning an event that addresses a social issue. This work should be carried out beyond the boundaries of the campus, through interviews with professors or experts across the United States or even internationally. The student would guide their team in writing up the findings of the work -- the final product of an exercise in team leadership in the real world.

American universities are known the world over for faculty who do innovative research. But they haven’t always applied that innovative spirit to their own curricula. As technology advances and the cost of education soars, it’s time for institutions to rethink their approach and focus on preparing mature students to best serve the country’s future generations.

Arthur “Tim” Garson Jr. is director of the Texas Medical Center Health Policy Institute and the former provost of the University of Virginia. Robert C. Pianta is dean of the Curry School of Education at the University of Virginia.

The importance of a core curriculum (essay)

In a recent salvo in what some observers call the “war on the core (curriculum),” Donal O’Shea, president of New College of Florida, points to the disadvantages of the ostensibly rigid and compulsory nature of too many fixed graduation requirements. He alleges that such a dynamic limits opportunities for intellectual exploration and development. Such criticisms might leave readers with the impression that colleges with strong core requirements leave students with little intellectual room to grow.

But naysayers rarely mention this: a thorough seven-subject general education sequence, such as the core curriculum for which the American Council of Trustees and Alumni advocates in its “What Will They Learn?” report, occupies at most 30 semester hours. It provides an unparalleled, diverse intellectual foundation for further study, while still affording students ample opportunity not only to complete their major but also to devote their attention to the topics that personally excite them.

Every educator will join O’Shea in his appreciation of the way that “serendipity” and discovery compose a major part of the excitement of liberal learning -- the “intervention of a gifted professor … taking an inspiring course or excitedly talking over an idea with a friend in a residence hall.” But colleges should not confuse intellectual exploration with the absence of structure and intentional scaffolding of intellectual growth or overlook how profoundly curricular standards help students distinguish between the serious and the trivial.

How does a course on Horror Films and American Culture at the University of Colorado at Boulder equate to an American history course designed as a comprehensive study of key events in our nation’s past? What about The Fame Monster: The Cultural Politics of Lady Gaga, offered in 2013 at Indiana University, where the most frequent grade was an A-plus? These misplaced priorities are a predictable consequence of privileging curricular “serendipity” over sound curricular structure.

Students enjoy flexibility, agency and choice -- but they also need and appreciate direction and structure, rather than being left to pick and choose course sequences with limited intellectual coherence. How will students be ready for serendipity when it comes, if they lack the intellectual foundation required to meaningfully engage in those pursuits? A Lumina Foundation study found that students get “tangled up” when they are left with too many choices, lengthening their time to degree. Faculty members and administrators have an obligation to give students the framework they need to grow intellectually and graduate.

Attributing boredom and tedium to required courses, and excitement and joy to curricular choice, simply does not stand up to a logical examination of the facts. A required course can be taught well or taught badly, and the same is true of the most culturally relevant elective. Try telling graduates of core curricula at programs as varied as those at Columbia University, Hampden-Sydney College, Pepperdine University, the University of Dallas and the University of Georgia that their experience was stale and intellectually limited.

The decimation of clear requirements and frameworks is likely a major contributor to a growing sense of drift and disappointment among college graduates -- and their employers. Survey data show that while nearly all provosts believe their institution is doing an excellent job of preparing students for careers, employers sharply disagree -- particularly when it comes to writing and critical thinking. A survey of employers by the Association of American Colleges and Universities found that only 26 percent deemed the critical thinking skills of their recently hired college graduates excellent. Just 23 percent thought that recent graduates were well prepared at “applying knowledge/skills to the real world.”

“Serendipity” is ultimately a poor substitute for academic leadership, for a board, administration and expert faculty coming together, determining the priority skills and knowledge that equip graduates for successful careers and informed citizenship, and then having the determination to reify those priorities in requirements -- not aspirations. Privileging faculty excitement over the needs of students -- about to face a ferocious, globalized job market -- is academic malpractice.

The survival of the liberal arts tradition demands that colleges act with urgency to clarify their requirements and expectations of students. Costs and sagging class enrollments are threatening entire majors and departments in essential subjects such as physics, philosophy and foreign language on many college campuses. Students at some liberal arts colleges are opting into vocational courses such as accounting and computer science.

Without rigor and cohesive requirements, the liberal arts will eventually confront a future of irrelevance. What’s called for here is a rigorous liberal arts education -- and a willingness to face up to our responsibility as standard-bearers in that process. Employers, taxpayers, parents and students are quite reasonably demanding more from higher education. Are we listening?

Michael B. Poliakoff is president of the American Council of Trustees and Alumni.

Scholars' unconvincing case about the value of the humanities (essay)

In our new book, Cents and Sensibility: What Economics Can Learn From the Humanities, we argue that the best of the humanities can help transform the field of economics, making economic models more realistic, predictions more reliable and policies more effective and just.

But what do we mean by the “best” of the humanities? Is it what is often taught in colleges and high schools? If not, might that explain why so many have said the humanities are in “crisis”?

Go to Inside Higher Ed, The Wall Street Journal and The New York Times, or read reports from Harvard University and the American Academy of Arts and Sciences, and you will discover that the humanities are in decline. Enrollments and majors continue to plummet.

But humanities professors themselves, like a delicatessen owner selling spoiled meat and blaming business failure on the vulgarization of consumer taste, fault their students. “All they care about is money,” they complain. “Twitter has reduced their attention span to that of a pithed frog.”

We tell a different story. For decades, literature professors have argued that there is no such thing as “great literature” but only things called great literature because hegemonic forces of oppression have mystified us into believing in objective greatness. One of the anthologies most commonly taught by literature professors, The Norton Anthology of Theory and Criticism, paraphrases a key tenet of cultural studies: “Literary texts, like other artworks, are neither more nor less important than any other cultural artifact or practice. Keeping the emphasis on how cultural meanings are produced, circulated and consumed, the investigator will focus on art or literature insofar as such works connect with broader social factors, not because they possess some intrinsic interest or special aesthetic values.” (Editor's Note: This paragraph has been updated to correct a statement about the anthology.)

But if Shakespeare and Milton are no more important than any other “cultural artifact or practice,” and if they are to be studied only “insofar as” they connect with other social factors and not because of “some intrinsic interest or special aesthetic values,” then why invest the considerable effort to read them at all? Perhaps students who don’t take literature courses are responding rationally to their professors’ precepts?

The language about “how cultural meanings are produced, circulated and consumed” gropes for the prestige of something hard, unsentimental and materialistic -- in short, for economics, as a literature professor might imagine it. It appears that humanists’ key strategy for saving their disciplines has been to dehumanize them.

And so we have a host of new movements, announced with the breathless enthusiasm appropriate for discovering the double helix. Sociobiological criticism has shown us how emotions and behaviors described in literature arose to serve an evolutionary purpose. Neuroaesthetics can explain why you love Dante (or Danielle Steel). Find something to count, and you can do digital humanities. In this spirit, we hereby claim to found the nano-humanities. We are not sure what it is, but we are sure that, like these other new disciplines, it will not involve real appreciation of masterpieces. Each of these dehumanities offers something of value, but they matter only if we already have deep appreciation of literature, which you can’t get by deaestheticizing, deliterizing or dehumanizing it.

Many humanists have difficulty in presenting their case because they are used to speaking one way among themselves and another way to outsiders. To the public at large, they still make statements about the value of great books, of the noblest things said by the most brilliant minds and of the need to know the Western heritage. Among themselves, such talk is, at best, hopelessly dated. Perhaps one reason literary scholars make an unconvincing case to outsiders is that they do not believe it themselves.

Students often come to college without any grasp of what reading great works entails. Their AP and other exams test knowledge of facts about literature, not actual understanding of it. Classes teach them to hunt for symbols, to judge writers according to current values, or to treat masterpieces as mere documents of their times. The first method makes reading into a form of puzzle solving, the second allows us to compliment ourselves on our advanced views, and the third misses the point that great literature speaks outside the context of its origin. Tolstoy is not great because he tells us about czarist Russia or the Napoleonic wars.

Each of those common approaches says true things, but none gives any reason to think that reading masterpieces is worth the effort. And that effort is considerable: Paradise Lost is difficult; War and Peace is long. And so the payoff would have to be large. Students would be fools to think otherwise.

No Shortcut

A good sign something has gone astray is that a work is reduced to a simple message. Only mediocre literature can be read that way. Otherwise, why not just memorize messages: love your neighbor (A Tale of Two Cities). Help the unfortunate (Les Misérables). Child abuse is wrong (Jane Eyre and David Copperfield). Do not kill old ladies, even really mean ones (Crime and Punishment). First impressions can be misleading (Pride and Prejudice). Don’t give in to jealousy (Othello). Obsessions can be dangerous (Moby Dick). Stop moping and do something! (Hamlet). There’s no fool like an old fool (King Lear).

If one cannot provide a convincing reason why such brief summaries will not do, then one has not really taught literature. The student needs to know why the book is worth reading, not just worth knowing about.

There’s no shortcut. One needs not just to analyze “the text” but to experience the work. People are always looking for some way around all that philistine human stuff, but with a novel, one has to identify with the major characters and coexperience their inner lives. Equating the work with the text is like equating music with its score, or expecting a blueprint of a house to keep out the rain. The humanities, especially literature, are about the human.

Here’s an alternative: Why not approach great literature as a source of wisdom that cannot be obtained, or obtained so well, elsewhere? There is an obvious proof that the great novelists understand people better than any social scientist who has ever lived. If social scientists understood people as well as Leo Tolstoy or George Eliot, they would have been able to portray people as believable as Anna Karenina or Dorothea Brooke. But not even Freud’s case studies come close. Surely the writers must know something! And great writers present ethical questions with a richness and depth that make other treatments look schematic and simplistic.

Moreover, great literature, experienced and taught the right way, involves practice in empathy. When we read a great novel, we identify with the heroine. We put ourselves in her place, feel her difficulties from within, regret her bad choices. Momentarily, they become our bad choices. Even when we do not like her, we may wince, suffer, put the book down for a while. The process of identification, feeling and examination of feeling may happen not just once but, in the course of a long novel, thousands of times. No set of doctrines is as important for ethical behavior as this constant practice in ethical thought or that direct sensation, felt over and over again, of being in the other person’s place.

The most important lesson novels teach is not a fact or a message but the skill of empathy and of seeing the world from other points of view. Practiced often enough, that skill can become a habit. One cannot get that lesson by reading a summary of “what the author is saying” or “analyzing the text.” One has to experience the work. What could be more important, for ethical and social understanding, than the ability to grasp what it is like to be someone from a different culture, period, social class, gender, religion or personality type? And one learns why even those broad categories won’t do, because one senses what it is like to be a particular other person. And that, too, is an important lesson: no one experiences the world in quite the same way as anyone else.

If we could more easily put ourselves in the position of others and put on a set of glasses to see the world their way, we might very well, when those glasses are off, still not share their beliefs. But we will at least understand people better, negotiate with them more effectively, or guess what measures are likely to work. Just as important, we will have enlarged our sense of what it is to be human. No longer imprisoned in our own culture and moment, or mistaking our local and current values for the only possible ones, we will recognize our beliefs as one of many possibilities -- not as something inevitable, but as a choice.

In short, the humanities, if humanists will only believe in them, have a crucial role to play in education. They have access to truths about human beings that other disciplines have not attained. And while other disciplines may recommend empathy, the humanities allow us to practice it. Their cultivation of diverse points of view offers a model for liberal arts education generally to follow. Properly taught, the humanities offer an escape from the prison house of self. We live on an island in a vast sea of cultures, past and present. The humanities allow us to leave that island and return to it enriched with the wisdom of elsewhere.

If you really want to save the humanities, make sure it is a version worth saving. Who knows, they might then just save themselves.

Gary Saul Morson is Lawrence B. Dumas Professor of the Arts and Humanities at Northwestern University. Morton Schapiro is a professor of economics and the president of Northwestern University. This piece is based on their newly published book, Cents and Sensibility: What Economics Can Learn From the Humanities (Princeton University Press, 2017).

Philosophy professors at St. Thomas in Houston, their contracts late, fear for their jobs

St. Thomas in Houston has held back reappointment notices for philosophy professors -- even those with tenure -- amid debates over budget and the curriculum.

Florida lawmakers look to cut remediation and cap bachelor's degrees

Florida’s Legislature looks to shake up the state’s two-year college system once again by cutting funds for developmental education, capping bachelor's degree programs and creating a new oversight board.

Comparative class at Grinnell focuses on migration and border policy in U.S. and Europe

First-year Grinnell students travel to Germany, Greece, Mexico, Spain and the U.S. Southwest for a comparative class on migration, borders and refugees.

Colleges start new academic programs

The questionable entrepreneurship mania on college campuses (essay)

Over the last decade or so, universities around the country have been tripping over one another to see who can slap the word “entrepreneurial” on the most things on their campuses the fastest. Entrepreneurial studies curricula and majors, business incubators, entrepreneurial centers, on and on -- entrepreneurial efforts have sprung up faster than the “innovate and disrupt” start-ups scattered about Silicon Valley that they seem so desperate to imitate.

This “Silicon Valleyization” of the university can be seen in places like Florida State University, which recently received a record $100 million to open the Jim Moran School of Entrepreneurship, or at Rice University, which last year announced the formation of an “entrepreneurial initiative” to transform the university into an “entrepreneurial university.” Other institutions -- such as Emerson College, the University of Hartford and the University of Massachusetts at Lowell -- have also joined the entrepreneurial arms race with their own centers and curricula.

For advocates of the entrepreneurial university, such moves mark a fuller alignment of higher education with the needs of the new economy. Universities are finally recognizing the central role that they now must play in spurring “endogenous growth” in the highly competitive global market, where innovation determines national, state and individual winners and losers. They are finally coming down from the ivory tower and lining up their curricula and research to meet that need. For critics, however, such developments represent yet another chapter in the capture of the university by particular economic interests -- and a further loss of autonomy and intellectual integrity, as institutions mindlessly chase the latest fad and buzz meme.

While the entrepreneur as a particular type of economic actor in the market economy has been around for some time, entrepreneurialism as a full-blown social and cultural movement is much newer. If we situate entrepreneurialism as a historically distinct social phenomenon, or perhaps as a post-Bretton Woods economic model, it contains several assumptions about society, politics and markets that largely go unacknowledged in the frenzy to create the entrepreneurial society and the enterprising university to accompany it.

First is the profound shift from a more organized style of the market economy -- with large corporations, unionized labor, slow growth, steady-state capital and a welfare-oriented state -- to a more disorganized one composed of start-ups, flexible labor, erratic growth, impatient capital and a market-oriented state. In the newer, churning model of the market economy, the entrepreneur -- personified in cultural and political heroes like Donald Trump and Mark Zuckerberg, rather than the corporate manager or professional -- becomes the central new cultural icon.

What this unceasing push for entrepreneurial innovation as a driving force of economic growth dismisses or ignores is the actual destructive part of creative disruption. Creative disruption seems fine as long as it is other people whose lives are disrupted rather than your own. This view is often callously unconcerned with the harm that can be generated by disruption for the sake of disruption or by the mantra that all innovation is progress. Here, in the mold of the economist Joseph Schumpeter and the Harvard University management gurus Clayton Christensen and Michael Porter, all disruptions are ultimately positive and all innovations are advancements. The market will miraculously, fairly and brutally sort out any lumps in the end. What disrupters yearn for is an always roiling and never resting society, generated by people continuously struggling with one another to provide the next best thing and “strike it rich.”

As Virginia Heffernan recently described it, such innovators compose a “sneakered overclass -- whose signature sport is to disrupt everything, from Ikea furniture to courtship.” They embody the religiously inspired dream of heavenly redemption, the modernist desire of continuous progress and “lotto fever” rolled into one. It is unclear, however, how far such a model can actually extend. How much innovation and disruption does the world need? Or, more important, how much can it actually take?

Second, entrepreneurialism as an idealized economic model promotes a rather distinct type of asocial, social Darwinist, “go it alone” mentality where the single, self-interested individual is seen as solely responsible for his or her successes. Here, even when philanthropy happens, it is ultimately designed for self-interest, as with Mark Zuckerberg’s Chan Zuckerberg Initiative LLC.

Even setting the rhetoric of Google-style teamwork aside, the entrepreneurial model celebrates the ideal of the lone-wolf innovator who works hard, charts their own course and “defies the odds.” It is the American mythology of rugged individualism recast for the jobless age of the precariat, forever-flexible labor and the post-welfare state.

In doing so, this new economic model passes off social inequality as just the normal and inevitable ebb and flow of winners and losers in a free-flowing economy that is in constant flux -- and one that people need to adapt to rather than try to change. If you work hard enough, innovate and adapt to the market, you are entitled to reap the rewards. Those who cling to the collective protection of unions or change movements, or even the left-behind world of tradition, are but mindless sheep who lack the imagination to think for themselves and adapt. If you fail in your endeavors, you need to readapt and reinnovate in order to make your way again.

As on the TV show Shark Tank, the swirling and hungry accumulated venture capital of those “who have already made it” is there waiting to provide for newbies with the right stuff. Surviving and prospering are strictly matters of your own doing. Yet all this ignores not only the high likelihood of failure in these start-ups (90 percent, according to Forbes magazine) but also the social costs of living in a world composed of a handful of wealthy winners and scores of poor losers chumming up the shark-tank economy.

Third, implicit in the romantic idealization of the entrepreneur is the neoliberal idea of a limited pro-business government. Rather than expecting government to level things out a bit through progressive taxation or some other modest modes of redistribution -- or through various social services such as public education -- the new economic model promotes a government that is entrepreneurial, too. This enterprising government doesn’t protect people from the market as in the social democratic model but rather forces even more marketization onto them.

People must be coerced (or, in the more polite terms of behavioral economics, “nudged”) by government to “have grit and determination,” “manage their own retirements and health care,” “have positive affect” and “be responsible.” They must be calculating, self-interested and self-promotional, even if they don’t want to be. Responsibility will, in the words of former British Prime Minister David Cameron, finally force people “to ask the right questions of themselves.”

None of this means that entrepreneurialism is necessarily a bad thing when taken in moderation and seen in the light of a larger political economy. We can certainly acknowledge the important contributions of the many small businesses and innovations built on entrepreneurial principles. But an entire society or university based solely or largely on those principles is rather problematic and limiting.

Universities should be leery of aligning their curricula and research just to meet the needs of the entrepreneur. It is one thing for a higher education institution to recognize entrepreneurialism as one particular economic form but quite another to become an entrepreneurial university. Universities are -- or should be -- like the economy and society themselves: too multidimensional to remake themselves into any one particular cause of the moment.

Steven C. Ward is a professor of sociology at Western Connecticut State University.

Amid enrollment declines, speakers consider the shape of the English major

Amid enrollment declines, speakers at the Modern Language Association convention discuss shifts in the major, such as a de-emphasis of traditional survey courses and the addition of more writing-related ones.

Amid turn toward nationalism, global educators consider their work
