the lyf so short, the crafte so long to lerne
--Geoffrey Chaucer, “The Parliament of Fowls”
Let’s begin with the Ivy League-educated Barack Obama: “But I promise you, folks can make a lot more, potentially, with skilled manufacturing or the trades than they might with an art history degree… I'm just saying you can make a really good living and have a great career without getting a four-year college education as long as you get the skills and the training that you need.” Apparently what was good enough for him is no longer good enough for factory workers in Milwaukee, Wisconsin (where he delivered that speech — though, to be fair, he did go on to apologize for the remarks).
And, of course, President Obama is not the only public figure who has the liberal arts in their sights. Governor Patrick McCrory of North Carolina made it clear that if he had his way the State of North Carolina would fund only the sort of education he deemed practical: "If you want to take gender studies that's fine, go to a private school and take it. But I don't want to subsidize that if that's not going to get someone a job." His Republican colleague in Florida, Rick Scott, was equally blunt: "If I’m going to take money from a citizen to put into education then I’m going to take that money to create jobs. So I want that money to go to degrees where people can get jobs in this state. Is it a vital interest of the state to have more anthropologists? I don’t think so." (All quotations are from Inside Higher Ed.)
The liberal arts are taking it on the chin and, since they were on their knees anyway, they have been an easy target. Over and over the voices raised against the liberal arts (and the humanities wing of them in particular) complain that they leave their students ill-prepared for gainful employment; that focusing on the liberal arts prevents students from studying the subjects they and the nation truly need developed; that they are for the idle wealthy (a particularly sharp-edged version of these arguments is available at the blog of the American Enterprise Institute, “Harvard, We Have a Problem: Too Many Liberal Arts Majors”).
Apparently, people have been listening. The evidence has been clear for some years that the liberal arts and especially the humanities side of them are fading from the cultural scene of 21st-century America. One study found that, since 1990, 39 percent of colleges identified as liberal arts colleges have vanished. Another study found that humanities majors now constitute fewer than 10 percent of all college majors in the U.S.
Of course, nothing lasts forever, so why should the liberal arts? “All things must pass,” George Harrison sang all those years ago, and even Shakespeare, that centerpiece of many a liberal arts curriculum, in one of the sonnets that seemed to claim immortality for poetry, recognized that his art is term-limited, concluding his wonderful Sonnet 18 with this couplet qualifying the shelf life of art: “So long as men can breathe or eyes can see, / So long lives this, and this gives life to thee.” There will come a time, that couplet acknowledges, when no men breathe and there will be no eyes to see. To everything there is a season and perhaps the season of the liberal arts has turned.
So if the liberal arts are sinking into enervated senescence, are passing the way of all the generations, I would like to linger for a few moments looking back over my life to muse on why I have spent the last four decades deep in the liberal arts, that is, on why the liberal arts mattered. Not that my life has been all that interesting (or, at least, not that my life would be interesting to anyone else), but the liberal arts are all that interesting and I would like to gesture toward that interest by way of my experience, as a way to suggest what we may all too soon be missing.
It all really did begin for me in a lecture hall in the old Main Building at New York University, on the east side of Washington Square Park. Dingy, drafty, somewhat grimy, windows smeared with the grease of years of students within and exhaust and smoke without. Wooden seats scarred and discolored and often cracked. The course was “Primitive Oral Heroic Poetry,” and the professor was the late Jess B. Bessinger, Jr. The reading list included Gilgamesh, Homer, The Book of Dede Korkut, The Song of Igor’s Campaign, Bantu warrior poetry and Beowulf. It was the Anglo-Saxon poem that prompted the performance that determined my life. Professor Bessinger had been describing the poetics of the Anglo-Saxon verse and especially the power of the alliteration that is a central feature of that verse, when he paused in his lecture to dwell on the strength of the linked words, to suggest to us that alliteration could still be a powerful tool in the hands of a master poet. And he proceeded to recite, to intone really, from memory a section of Tennyson’s In Memoriam that concludes with a particularly thrilling use of alliteration:
Dark house, by which once more I stand
Here in the long unlovely street,
Doors, where my heart was used to beat
So quickly, waiting for a hand,
A hand that can be clasped no more —
Behold me, for I cannot sleep,
And like a guilty thing I creep
At earliest morning to the door.
He is not here; but far away
The noise of life begins again,
And ghastly through the drizzling rain
On the bald street breaks the blank day.
Bessinger had a magnificent deep baritone and he spoke those lines as if they were coming from the center of his being — modulating and pausing and letting the emotional sense of the words linger out the vowels across the metronomic pressure of the metric pulse, coming to those last consonants with a devastating finality that rendered perfectly the desolation Tennyson’s words evoke.
At that moment I determined that I wanted that experience, that I wanted to live in words so deeply that they would become so much a part of me that I could summon them immediately and without premeditation. I wanted to know a poem so well that it would be with me whenever I wanted or needed it. Those words of Tennyson’s in the voice of Bessinger so feelingly captured the experience of grief — the world going on outside the grieving consciousness of the bereft does seem “ghastly” — and so beautifully rendered the individual experience, thereby providing a kind of general access that any and all could share, that I intuited as much, felt it in my marrow, at that moment in the silence of the stunned classroom (at least in my memory all of us sitting in that faded lecture hall shared the sense of awe in the presence of a poem coming to life in the air). Although I had not yet experienced the sort of grief out of which Tennyson’s poem grew, I knew then that it had a shape and a sound, and that when that sort of grief did descend on me I would recognize it.
I had been infatuated with certain poems before, mostly poetry I had seen mentioned by my heroes (the Beatles and Bob Dylan particularly): Ginsberg, Whitman, Blake were the main ones. And late in Richard Nixon’s first term I came across “The Hollow Men” and thought it spoke directly to the world being mangled in plain view. But Bessinger’s summoning of the spirit of Tennyson’s poem in the mingled air of that Main Building lecture hall determined for me the course of my life, determined that for the next 40 years (and, no doubt, for the years remaining to me and my memory and mind), poetry, stories, plays — Literature (with the upper-case to designate my reverence for "the best words in the best order") would be the central obsession of my consciousness.
So what has this obsession given me? I am not wealthy, though my family and I live far better than most of the people with whom we share the planet. But wealth was never my object. All the bromides that are generally marshaled on behalf of the liberal arts clamor for attention here. Critical thinking; tolerance; flexibility of mind; problem-solving; and the rest of them that sound so vacuous up against the voices we heard at the beginning of this essay. Yes, I suppose, I do think more critically than I would have had I never taken English and philosophy and political science and psychology … all those classes that constituted my undergraduate liberal arts education. I am certainly more aware and tolerant of differing views. I am certainly more aware of different cultures and different times and places and peoples from the people, places and times among which I have lived. And it must be admitted that whatever critical thinking and tolerance and recognition I have been able to practice have been practiced, have been honed, have become habitual to my way of being and those habits were planted in those long-ago classrooms on the edge of Washington Square Park.
But those habits aren’t why I have remained immersed in the world of words and ideas. And those habits, thankful as I am to have them, are not what kept me in those classrooms in the first place and are not what have kept me in their long, long stretching, encompassing aura since. The real reason is pleasure. The pleasure of having my mind tickled into action by the vibrations of words sprung into patterns “where more is meant than meets the ear.” The pleasure of having within my reach congeries of words that render a life, that render living, more completely and more profoundly and more compassionately than hours of my groping for my own formulations could ever hope to achieve. I can’t tell you how often, confronted by a student, a colleague, an adult acquaintance whose ways of being in the world have clearly been marred by something in the past, how often in such moments Larkin’s supremely packed line has come to mind: "an only life can take so long to climb clear of its wrong beginnings and may never." I’m not sure how English speakers have managed for all the centuries of our language without that line.
Unaware of what President Obama would discourage years later, I did take an art history class once. After four decades I’m not sure how much I remember beyond a detail here and there. Our textbook was Gardner’s — or was it Janson? It was red and large (as large as the Riverside Shakespeare that I also had to haul around that semester — that I do remember). Did I learn critical thinking in that class? Among the defenses mounted on behalf of art history in response to Obama’s dismissal was the usual: art history teaches critical thinking. Among the details I do remember from that course is that I learned how to look at paintings from the 15th century, one painting in particular. That course taught me really to see Bellini’s “San Francesco nel deserto.” And I was fortunate that I lived in the city where the Frick sits, and so Bellini’s painting was available in all its magnificence whenever I could make my way to the Upper East Side (with a student ID, the suggested entrance fee was minimal if not waived).
I learned that beyond the shimmering magic of the light and shade and nuances of light and shade Bellini deploys across the canvas, and beyond the minute detail of the natural world surrounding the enraptured saint, beyond or really within all of that splendor the painting speaks in a series of languages that course taught me to hear, as it were. The rabbit poking its head out of the lower corner of the canvas, the donkey standing patiently, the long-legged shore bird, the cracked rock, all of these perfectly captured natural objects carry meaning in a register beyond the surface register of accurate detail. And that course taught me to look for those kinds of meaning. That course deepened my experience of that painting and, as a consequence, of all painting.
This is, I suppose, critical thinking. Once you begin to see linear and atmospheric perspective and chiaroscuro and all the technical arsenal whose names I’ve forgotten but whose presence I’ll never forget … once you learn to see, you can look and see a great deal more than what immediately meets the eye. If that is what the art historians mean by critical thinking, they should say so plainly, for it is valuable precisely because it deepens one’s pleasure in the world we share. And that is what the liberal arts do. They are life-affirming, life-enriching, indeed, life-enabling forms of human engagement with the world (in addition, of course, to their indispensable value as preparation for any number of successful career tracks). Especially at this time in the history of our culture, we must champion the liberal arts as modes of being in the world that have the power to transform those who are fortunate enough to experience them into more articulate, more thoughtful, more comprehensively human citizens. The liberal arts provide an education for life.
I don’t think I’m just being idiosyncratically pessimistic to worry about the future of the liberal arts in our culture. And I find myself, as this worry settles itself in my mind, looking back. I have spent what I consider to be many profitable hours reading over the lectures and notes of Thomas Frederick Crane (first professor of romance languages and first dean of the College of Arts and Sciences at Cornell University, where I work), particularly those thoughts of his he committed to paper concerning the college he helped found. And among the aspects of Crane’s reflections I would hope to carry forward into the uncertain future spreading before us are those virtues, values, habits of mind … whatever we call them … those qualities of a liberal arts education I think have been at the core. Perhaps others would name them differently, but here is what I name them: curiosity, generosity, diligence, care, patience — above all, patience. Patience is what Crane meant when he said that a liberal arts education is “a process that for better or for worse will continue as long as our lives, and any scheme of collegiate education will be a dismal failure which does not implant the seeds of later fruitage.”
As I was working on this, I finished rereading The Portrait of a Lady by Henry James. Rereading it constitutes one of the great reading experiences of my life. Beyond its own nearly unfathomable wonder, reading it in the context of writing this essay, and of what this essay gestures toward in the world around us, has given the novel an added poignancy for me. It is a novel whose central actions, if actions they can be called, are two. Some 390 pages into the novel, a woman, the lady of the title, notices another woman and a man in a room, not doing anything, just in the room, and the composition the man and the woman make in how they sit and stand carries a profound meaning for the observing woman. Later that woman, goaded into thought by her observation of the other woman and man, will spend an entire night, and James will spend an entire chapter describing her night, and all she does through that night is sit in a room thinking while the candles gutter toward dawn: “she leaned back in her chair and closed her eyes; and for a long time, far into the night and still further, she sat in the still drawing-room, given up to her meditation.”
That is from the third sentence of the chapter and she does not move from that chair for another 13 pages. Would any novelist, any writer, any film-maker or television producer, would any artist now venture to devote a substantial portion of her or his work to a woman sitting still and thinking? Would any such artist have that prolonged session of sweet, silent thought count as the central action of her work? For that matter, would any of us actually sit in a room in stillness and silence and darkness for hours on end given up to the wandering meditation of our minds? James is thought of as a novelist who adheres to reality, but is such a reality possible for us?
The qualities of the liberally educated that T. F. Crane believed in, and that the education he helped create here at Cornell inculcated, were, above all, qualities of curiosity and patience, circumspection and attention, what I gather some now call mindfulness, a useful word in my taking of it to mean: having your mind at full play in its engagement with the world. No form of education yet devised is better at bringing the mind to the fullness of its capacities than the education offered in the liberal arts. Without the patience instilled by immersing oneself in the mind-stretching range of the liberal arts, we are reduced to jittering appendages of the plastic devices in our hands, dried leaves scattering to the whims of market and fashion, addicts to money and status and consumption. Without the liberal arts, how will we ever, in our information-saturated, buzzing, overstimulated reality, actually sit still long enough to hear our own minds at work?
David N. DeVries is associate dean for undergraduate education in the College of Arts and Sciences at Cornell University.
Some will immediately say this is nothing more than a semantics debate. No different than if we were discussing the contrasting meanings of, say, “soda” and “pop.”
When we use the word “pedagogy” as a catchall for all teaching methods, of course, no one is talking about little children, but we rarely stop and specifically consider what this word means and its relationship with other words.
Pedagogy: the methods and practice of teaching children.
Andragogy: the methods and practice of teaching adults.
So the question becomes: at what point is a student no longer a child, but an adult? There is no hard-and-fast rule, but for our purposes here, any college student is an adult.
Andragogy, a concept dating to the 1960s and Malcolm Knowles, is important because it recognizes that adult learners are different and that these differences are extremely important. And its importance, as a body of knowledge and approach in and of itself, is profound and vastly under-recognized.
Andragogy -- adult learning theory -- stresses that adults:
Are more independent than children when it comes to learning.
Are capable of critical thinking (unlike some children) but are still interested in the “correct answer.”
Learn more slowly but just as effectively because they have more life experience and deeply ingrained stereotypes and ideas.
Must be given respect as adults and for their life experience or lack of experience.
Need classrooms that embrace active learning, including hands-on activities.
Learn best when material is relevant to their needs.
Are driven less by grades (performance goal orientation) and more by understanding (mastery goal orientation).
Going back to the question of when students become adults, in some ways it does not matter per se. All learners learn best when many of the core elements of andragogy are followed. All students — whether 5, 15 or 55 — deserve respect, need room for their prior experiences, and need lessons to be relevant. That said, the idea of andragogy exists on a sliding spectrum of sorts. Whether a student is 18 or 85, he/she will enter the classroom with experience, for example, but this experience will vary based on age, interests, background, etc.
This is also where some understanding of basic human growth and development theories (e.g., Maslow’s hierarchy of needs, Erikson’s stages of psychosocial development, Piaget’s stages of cognitive development) can help professors build classrooms that are comfortable across the board. Students in their 30s will tend to have very different biologically driven needs, hopes, and fears than students in their 60s.
When students are not allowed opportunities to express their feelings, especially about particularly sensitive topics or topics about which they have been vastly miseducated or undereducated, learning stops. (Please see my comments about the trigger warning or objectionable material warning and student feelings here.) Additionally, we know that for learners of any age it is very hard, even physiologically impossible without extreme dedication, to “unlearn” what have been “core truths,” whether the topic is basic physics or the causes of the Civil War.
This said, pedagogy is still important because children do learn differently and have different needs. Most notably, children need more guidance. Likewise, children — depending on their age and experience (back to the sliding spectrum) — are physiologically not always capable of performing advanced math or demonstrating critical thinking. This is not at all to sanction the “banking method” of education — where teachers only lecture, metaphorically dumping information into students’ brains, and students then regurgitate that information verbatim on assessments — that has sometimes been all too common: active learning and student-centered learning are always best.
One note on learning styles: adults do tend to think they have a learning style — visual, kinesthetic, auditory — that enables them to learn more effectively. While I have read much more about andragogy than learning styles, there is some research suggesting that learning styles are actually a myth: they have relevance because we give them relevance, but it is roughly equally possible for learning to happen visually or kinesthetically, for example, and ALL learners learn best when all learning styles are used. Going back to Bloom’s Taxonomy: learning that involves interactive thinking, hearing, reading, writing, touching, and creating is the most effective, and naturally, much of this requires independent learning and initiative by an adult student.
Even if we recognize that adults learn differently from children, by using the umbrella term “pedagogy” for both, we unconsciously tend to view adult learners as “children” who need to be taught by the “expert,” and we miss an entire body of knowledge and research about effectively teaching. I know some professors do not like the idea of being taught how to teach — they say it sounds too much like the training required to teach K-12. I too was somewhat like this when I first started teaching college in 2007.
But, as professors in the classroom, our ultimate goal should be for our adult students to learn, and for learning to occur, we should always be aware of how to teach effectively and stay reasonably up-to-date on findings as they develop.
For further information on andragogy check out this website; Malcolm S. Knowles’s The Adult Learner (now in its seventh edition); and Sharan B. Merriam, et al.’s, Learning in Adulthood: A Comprehensive Guide.
Andrew Joseph Pegoda is completing his Ph.D. in history at the University of Houston, where he also teaches. He studies race, culture, human rights, and education. He regularly blogs here.
In order for colleges to function as inclusive communities of responsible and respected members, all of their adults must be treated as adults. Yet, many of my faculty colleagues habitually call their undergraduates “kids” by default. They should stop. In addition to usually being false, it is demeaning and it tacitly encourages the immature behaviors we all bemoan.
When undergraduates begin college they immediately receive warnings that high school is over and that they will now be held to adult standards of conduct. Meanwhile, hallways are filled with faculty and students talking about which of their classes have especially “good kids,” “quiet kids” or “lazy kids.” In our speech, undergraduates are demoted back to children — they are infantilized. The resulting mixed messages would confuse anyone. Undergraduates are held to high behavioral standards (“I have zero tolerance for accidental plagiarism. A college student should know better.”). At the same time, they are spoken of as children (“The kid who plagiarized in my class is asking for leniency.”).
In my state and most of the U.S., we formally recognize 18-year-olds’ right to make autonomous choices while also being held accountable for a full set of societal responsibilities. Eighteen-year-old men and women begin college having recently earned the right to sign contracts and take full responsibility for the consequences; those who are U.S. citizens have recently earned the right to vote and the duty to serve as jurors; most of the men have completed their mandatory registration for Selective Service in case a military draft is ever reinstated. These men and women who are undergraduates live with the adult consequences of their adult rights and responsibilities when they get tattoos, decide whether to seek mental health treatment, get married, sign up for credit cards and so on.
What about those still-developing young adult brains? In contrast to the rigid law, developmental psychology research paints a complex picture of how traits gradually develop over time, with features such as “psychosocial maturity” varying substantially from person to person within an age group. Appealing to the developmental psychology literature will not justify the decision to walk into a lecture hall filled with young adults one scarcely knows, each at variable stages of development for a wide array of psychological and behavioral traits, and say, “Quiet down, kids!”
By publicly referring to undergraduates as “kids,” faculty members unwittingly invite childish behaviors. Kids ask their parents to call the instructor about a bad grade. Kids whine that they were not reminded about the homework that was due. Kids giggle when a peer shares an embarrassing personal story during class. Kids make inappropriate jokes to get a laugh from the room. These behaviors then become perceived justifications for continuing to see undergraduates as kids. The vicious cycle perpetuates the behaviors that faculty members wish to prevent. You’ll have to take my word for it, but my undergraduate students exhibit none of those childish behaviors. They act like the adults they are. I contend that the key to achieving this is the radically intuitive strategy of treating them like adults.
If there is one thing I have learned from teaching controversial philosophical subjects (e.g., the ethics of health care policy) to undergraduates, it is that a good classroom environment is the product of an explicit and consistently applied ethos. On the first day of class I tell my students that I will treat everyone in the room as adults whose contributions are valued, and that I expect them to do the same. They are not allowed to use the words “kid,” “idiot,” “bleeding heart,” or any other disparaging language to describe each other, as this is incompatible with a classroom that is inclusive of its diverse members. In a recent course evaluation from a senior seminar, a student expressed gratitude that I did not treat the class members as “inferiors.” It upsets me that such a thing bears mentioning. A roughly 22-year-old man or woman was so accustomed to being treated as a child or a second-class citizen that he or she felt obliged to mention it when treated otherwise.
Thinking of and speaking of undergraduates as “kids” can manifest in class policies ill-suited for adults. Perhaps the clearest examples of this are some of the faculty responses to poor undergraduate behavior. There is undeniable appeal in some of my colleagues’ approaches, such as publicly shaming students caught looking at Facebook in class or confiscating any cell phones used for texting during a lecture. However tempting it might be, this is not appropriate behavior between two adults. This is how an adult treats a kid.
If a dean did such things to faculty members during meetings then he or she would rightly be called a tyrant (and would likely have a large collection of cell phones). Strategies responding to an adult’s childish behavior must work within a framework of adult-adult interaction. If students use their cell phones in class then the instructor can easily initiate a brief classwide conversation about the classroom policies and penalties, as well as the reasons for them. An instructor can also speak candidly and politely with an individual student after class ends about any violated policies.
Every adult has moments of childish behavior. It is one thing to criticize an individual adult for a specific childish behavior, but quite another thing to indiscriminately call a whole group of adults “kids.” There are indeed cases where it might be appropriate to refer to an individual student as a “kid” or “child,” much like it occasionally might be appropriate to refer to an individual student as a “jerk.” Faculty members need to privately grumble and blow off steam just like anyone else — call it the Happy Hour Exemption. This does not make it acceptable to use “kid” (or “jerk”) as one’s default term for undergraduates. Even when used as a term of endearment, “kid” still devalues undergraduates as autonomous agents. It is no more appropriate than saying “good boy” to a graduate student who wrote a strong paper, or describing a junior faculty member as a “nice girl.”
Whether they grew up listening to the Everly Brothers or the Jonas Brothers, adults deserve to be spoken of and treated as respected and accountable human beings. Many undergraduates are new adults, and unsurprisingly most are not yet very good at acting like adults. This does not excuse faculty members who casually refer to these men and women as “kids.” If anything, the infantilizing language sends the misleading message that undergraduates are permitted to act like children. Unfortunately, the undergraduate-as-kid mindset is deeply ingrained in campus culture, making change difficult. We even have the audacity to reserve the term “adult learners” for undergraduates over the age of 25. This status quo is unacceptable. The adult men and women in our undergraduate courses deserve better.
Sean A. Valles is assistant professor in the Lyman Briggs College and department of philosophy at Michigan State University.
The teaching of introductory economics at the college level remains substantively unchanged from the college classroom of the 1950s, more than 60 years ago. The teaching of other introductory courses, from psychology to biology, has changed dramatically -- with new knowledge and, more importantly, new pedagogical techniques. Today’s students are also very different: they are not accustomed to sitting through 50-minute lectures taking detailed notes on material and techniques whose value has yet to be demonstrated to them.
Thus, it is little wonder that more students do not elect introductory economics or, following the course, do not take more economics. Grades tend to be lower in introductory economics, discouraging many students from taking additional courses. The concern is paramount: in today’s complicated world, the design of sound policy requires an understanding of economic principles. Yet so many who decide on policy, particularly voters, since it is they who elect policymakers in democracies, are frequently ignorant of economic principles. Many students do major in economics, but frequently for what is perceived to be enhanced employment prospects in the business world. Love of the study of economics does not seem to be manifested by many students. If a school has an undergraduate business major, the number of economics majors falls precipitously. Fewer and fewer graduates of liberal arts colleges go on to economics Ph.D. programs, which are increasingly populated by very able international students.
Also, our traditionally underrepresented groups are truly underrepresented as students of economics. Women, though more than a majority of today’s college students, still shy away from economics, as shown by a recent study by Professor Claudia Goldin of Harvard University. African Americans and Latinos also are not well represented in college economics classrooms. Why? Different hypotheses, each of which probably has some significance, include alienation from the teaching methods, a lack of role models in the classroom, and difficult material and low grades combined with the additional challenge of being a minority student or a first-generation student. Also, normative issues such as poverty and discrimination are frequently marginalized, reducing the relevance of the course for many students.
Recently I chaired a meeting of faculty members from over 60 undergraduate economics programs, convened to discuss both Advanced Placement economics and the future of the college introductory course. There was consensus that the course is still structured and taught in a manner consistent with the first edition of Paul Samuelson's famous and dominant textbook, Economics: An Introductory Analysis, published in 1948 (and destined to run to nearly 20 editions). Texts and the course mirror the major theoretical components of basic microeconomics and macroeconomics. Much has been added (e.g., game theory and rational expectations) with little subtracted (perhaps labor unions and the Keynesian fixed-price case). The result is a rather encyclopedic textbook with 30 or more chapters.
Faculty race through as much of this material as possible. With such breadth, depth is frequently sacrificed. Students too often memorize rather than learn. Grades tend to be among the lowest of introductory college courses, and student satisfaction is highly variable. Most distressingly, students are not necessarily learning to think like economists or to understand the power of economics as an explanatory tool for human behavior.
Thus, there is momentum to address the deficiencies of this extraordinarily important introductory course. More faculty are aware of these problems and recognize some lack of student enthusiasm. The College Board has initiated the discussion I mentioned, with groups convened to look at both introductory microeconomics and introductory macroeconomics. Under the guidance of the respected economist John Siegfried (of Vanderbilt University) and a blue-ribbon committee of university economists, the National Council for Economic Education has developed 20 standards for economic understanding and literacy, applicable to differing levels of education. Textbook companies now offer customized books, so that a faculty member need not present 30-plus chapters but can instead select the 10 or 20 that will form the basis of a streamlined course, one in which students can truly learn economic concepts.
With such positive momentum, will the worthy objective of a newly inspired and improved economics course become a reality any time soon? Obstacles remain. Tradition and lethargy can be powerful brakes on new methods and ideas. And a course with less breadth means eliminating some topics. Which will they be? For some economics faculty, labor markets can be eliminated; for others, labor markets form the heart of microeconomics, and certainly of macroeconomics. Yet the payoff is potentially so high.
From my perspective, students who take introductory economics should complete the course with some understanding of 1) why income inequality exists and how to address it, 2) the means by which negative externalities like pollution should be addressed, 3) how international economic exchanges can be mutually beneficial, 4) the causes of our most recent Great Recession, 5) how to address long-term unemployment, and 6) the causes of global inequality. Such a course would be of greater interest to our students in 2014.
In a world full of excruciatingly complex and dangerous problems (from income inequality to environmental degradation), economics, as a discipline, must be a central player as orderly resolutions are sought. As mentioned, students might actually study in an economics course the causes and consequences of the Great Recession of 2008. Today, such a topic is often deemed too esoteric and left outside the mainstream canon of economics. A generation of students, from varying backgrounds and experiences, should be taught to appreciate and even admire the power and the logic of economic analysis. Parents, students, and voters, you all must help to ensure that this opportunity for important educational analysis is not lost.
Clark G. Ross is Johnston Professor of Economics and dean of faculty emeritus at Davidson College.
Telltale fast clicks of laptop arrow keys gave away my distracted student from 30 feet off.
So engrossed was he in a 1980s role-playing game that he barely noticed when I leaned in to whisper how entirely inappropriate his behavior was during my digital humanities class at Dartmouth College. As a noted visiting technology and culture speaker held forth on participatory culture and Wikipedia — in which my students had expressed an avid interest — I was shocked as he and many others openly engaged with their Facebook pages.
Why would he play a game in a class he insisted he enjoyed? He had been playing the game before he went to bed, so when he opened his computer in class, "the game was just there, so I started playing," he explained to me during office hours. He didn't intend to check out from a class he likes.
But he did.
This incident is particularly ironic because in the previous class we had discussed the myth of multitasking and the work of the Stanford University professor Clifford Nass. Now, some class members can handle themselves with their technology; about a third cannot. Multitasking makes us poor learners, studies show. It not only hurts the perpetrator by splitting their focus and attention, but it hurts those sitting around the multitasker and lowers everyone's overall performance on each task. While millennials may think they have better multitasking chops than older generations, data show this assumption to be false. Unfortunately, science tells us the human brain is not meant to multitask and succeed, even if people truly believe they are good at it.
As a digital humanities professor, I spend many waking hours communicating and creating on a computer. I deeply understand the exciting possibilities offered by digital tools because I spend a great deal of time designing them: in fact, I've designed a class that minimizes lecture time in favor of engaging activities involving mapping, making diagrams and hosting debates on virtual experiences. I also know there's a time and place to engage with actual people. There is rising concern among faculty members across the country about the need to examine classroom culture regarding technology so that students might actually benefit.
Clearly, the lure of the laptop is too compelling to resist.
Some people might say we should use technology for activities that “flip” the traditional lecture class. In line with this thinking, I have run hands-on activities one to two times a week with great success. During those activities, students are focused and use technology to further learning.
But most days, there will come a time when faculty or guest speakers actually speak, dialogue happens, or provocative points are raised. It is then that students with technology-control issues immediately check out and check in to Facebook, online games, or shoe shopping. Unless they are directly involved in a hands-on activity for which they will be held publicly accountable by the end of class, it is much easier to give in to the pull of technology and lose the experience of direct engagement.
Is this chronic narcissism? Or is this phenomenon a desire to escape the confines and taxing nature of concentration? Does this "checking out to check in" represent an insatiable need for immediate news, the "fear of missing out"? In 2013, an international team of researchers designed a way to measure fear of missing out and found people under age 30 were more affected, as were those who reported low levels of connectedness, competence and autonomy. This research supports earlier findings that the lonely and bored were more apt to rely on social media and feel left out if they miss out.
Yet students also miss out if they cannot listen and engage with the world in front of them. The presence of technology in a class may not work unless the active learning is active all of the time or, I would suggest, our culture changes such that the live, in-person exchange is more valuable than whatever the student is doing with his or her own technologies.
In higher education, themes of dialogue, listening and presence are a core part of the college experience. We know from research that employing "embodied cognition" (that is, learning with all of the senses) is a more holistic and effective way to learn. So much of the human experience is reflected in a face-to-face meeting, where body language, expression, tone of voice, and the presence of others create a whole-body experience. If higher education continues its mission to support experiential learning, we may need to reestablish forms of learning centered in bodily experience, and lean a little less on technology to transform ourselves.
We need a culture change to manage our use of technology, to connect when we want to and not because we psychologically depend on it. Enough is enough. We need strategies for unplugging when appropriate to create a culture of listening and of dialogue. Otherwise, $20,000 to $60,000 a year is a hefty entrance fee to an arcade.
Mary Flanagan is a distinguished professor of digital humanities at Dartmouth College and a fellow of The OpEd Project.
“What would the United States look like if we really gave up on liberal education and opted only for specialized or vocational schools? Would that really be such a bad thing?”
The interviewer was trying to be provocative, since I’ve just written a book entitled Beyond The University: Why Liberal Education Matters. What exactly would be the problem, he went on, if we suddenly had a job market filled with people who were really good at finance, or engineering, or real estate development?
Apart from being relieved that he hadn't included expertise in derivatives trading in his list of specializations, I did find his thought experiment interesting. Would there be real advantages to getting students to hunker down early into more specific tracks of learning? In that way they would be "job ready" sooner, contributing more quickly to the enterprises of which they are a part, and acquiring financial independence at the same time. Would that really be such a bad thing?
The debate between those who want students to specialize quickly and those who advocate for a broad, contextual education is as old as America itself. The health of a republic, Thomas Jefferson argued, depends on the education of its citizens. Against those arguing for more technical training, he founded the University of Virginia, emphasizing the freedom that students and faculty would exercise there. Unlike Harvard University and its many imitators, devoted to predetermined itineraries through traditional fields, he said, Virginia would not prescribe a course of study to direct graduates to “the particular vocations to which they are destined.”
At Mr. Jefferson’s university, “every branch of science, useful at this day, may be taught in its highest degree.” But who would determine which pursuits of knowledge would prove useful?
Jefferson, a man of the Enlightenment, had faith that the diverse forms of learning would improve public and private life. Of course, his personal prejudices limited his interest in the improvement of life for so many. However, his conception of “useful knowledge” was capacious and open-ended – and this was reflected in his design for the campus in Charlottesville. He believed that the habits of mind and methods of inquiry characteristic of the modern sciences lent themselves to lifelong learning that would serve one well whether one went on to manage a farm or pursue a professional career. It is here we see the dynamic and open-ended nature of Jefferson’s understanding of educational “usefulness.”
His approach to knowledge and experimentation kept open the possibility that any form of inquiry might prove useful. The sciences and mathematics made up about half of the curriculum at Virginia, but Jefferson was convinced that the broad study of all fields that promoted inquiry, such as history, zoology, anatomy and even ideology would help prepare young minds. The utility was generally not something that could be determined in advance, but would be realized through what individuals made of their learning once outside the confines of the campus. The free inquiry cultivated at the university would help build a citizenry of independent thinkers who took responsibility for their actions in the contexts of their communities and the new Republic.
Jefferson would have well understood what many business leaders, educators and researchers recognize today: that given the intense interconnection of problems and opportunities in a globalized culture and economy, we require thinkers who are comfortable with ambiguity and can manage complexity. Joshua Boger, founder of Vertex Pharmaceuticals (and chair of the board at Wesleyan University), has pointed out how much creative and constructive work gets done before clarity arrives, and that people who seek clarity too quickly might actually wind up missing a good deal that really matters. Boger preaches a high tolerance for ambiguity because the contemporary world is so messy, so complex.
Tim Brown, CEO of IDEO, one of the most innovative design firms in the world, has lamented that many designers "are stuck with an approach that seems to be incapable of facing the complexity of the challenges being posed today." He calls for a flexible framework, arguing that leaving static blueprints behind for "open-ended, emergent, evolutionary approaches to the design of complex systems can result in more robust and useful outcomes." Like many CEOs across the country, Brown recognizes that more robust and useful outcomes will come from learning that is capacious and open-ended -- from liberal education.
At the Drucker Forum last year, Helga Nowotny, president of the European Research Council, described what she called the “embarrassment of complexity” – efforts based in data analysis to dissolve ambiguity that lead to more conformity and less creativity. She called for an ethos among business and government leaders that would instead “be based on the acknowledgement that complexity requires integrative thinking, the ability to see the world, a problem or a challenge from different perspectives.” That’s a call for integrative thinking based in liberal learning.
In America, liberal education has long been animated by the tension between broad, open-ended learning and the desire to be useful in a changing world. Calls for dissolving this tension in favor of narrow utilitarian training would likely produce just the opposite: specialists unprepared for change who will be skilled in areas that may quickly become obsolete.
So, what would America look like if we abandoned this grand tradition of liberal education? Without an education that cultivates an ability to learn from the past while stimulating a resistance to authority, without an education that empowers students for lifelong learning and inquiry, we would become a cultural and economic backwater, competing with various regions for the privilege of operationalizing somebody else’s new ideas. In an effort at manic monetization without critical thinking, we would become adept at producing conformity rather than innovation.
The free inquiry and experimentation of a pragmatic liberal education open to ambiguity and complexity help us to think for ourselves, take responsibility for our beliefs and actions, seize opportunities and solve problems. Liberal education matters far beyond the university because it increases our capacity to shape a complex world.
We write as faculty members teaching in gender/sexuality studies, critical race studies, film and visual studies, literary studies, and cognate fields. We empathize with the difficulties our students bring into the classroom, from their pasts and/or from their ongoing battles with violence, sexual assault, racism, and other traumatizing events, both everyday and extraordinary. As faculty of color, female, and/or queer faculty, many of us have had some of the same experiences.
However, we are concerned about the movement on college campuses to mandate or encourage “trigger warnings” – notifications that class material may cause severe negative reactions – on class syllabuses. We are currently watching our colleagues receive phone calls from deans and other administrators investigating student complaints that they have included “triggering” material in their courses, with or without warnings. We feel that this movement is already having a chilling effect on our teaching and pedagogy. Here, we outline why a movement with the very salutary intent of minimizing student pain may be, in fact, ineffectual as well as harmful to both students and faculty. We offer this outline in the spirit of collective engagement amongst faculty, students, and administrators because we want to support both faculty in their choice to teach difficult material and students in their need for an ethic of care at the university.
1. Faculty cannot predict in advance what will be triggering for students. The idea that trauma is reignited by representations of the particular traumatizing experience is not supported by the research on post-traumatic stress disorder and trauma. Flashbacks, panic attacks, and other manifestations of past trauma can be triggered by innocuous things: a smell, a sudden movement, a color. There is simply no way for faculty to solve for this with warnings or modified course materials.
2. There is no mechanism, in the discourse of “triggering,” for distinguishing material that is oppositional or critical in its representation of traumatizing experience from that which is sensationalistic or gratuitous.
3. Most faculty are not trained to handle traumatic reactions. Although many of us include analyses of the cultural logics and legacies of trauma and/or perpetration in our courses, this expertise does not qualify faculty to offer the professional responses traumatized students may need. Institutions seriously committed to caring for traumatized students ought to be directing students, from their first days on campus, to a rich array of mental health resources. Trigger warnings are not an adequate substitute for these resources or for the information students need to get help.
4. PTSD is a disability; as with all disabilities, students and faculty deserve to have effective resources provided by independent campus offices that handle documentation, certification, and accommodation plans rather than by faculty proceeding on an ad hoc basis.
5. Trigger warnings may encourage students to file claims against faculty rather than seek support and resources for debilitating reactions to stressors. In fact, the complaint is implied in the structure of a warning; the warning serves as a guarantee that students will not experience unexpected discomfort and implies that if they do, a contract has been broken.
6. Even the best-intended, ad hoc declarations on syllabuses by individual faculty may lead students to expect or demand similar “disclosures” from other faculty who may feel that other ways of addressing students’ emotional reactions to material are more effective.
7. Faculty of color, queer faculty, and faculty teaching in gender/sexuality studies, critical race theory, and the visual/performing arts will likely be disproportionate targets of student complaints about triggering, as the material these faculty members teach is by its nature unsettling and often feels immediate.
8. Untenured and non-tenure-track faculty will feel the least freedom to include complex, potentially disturbing materials on their syllabuses even when these materials may well serve good pedagogical aims, and will be most vulnerable to institutional censure for doing so.
9. Trigger warnings may provide a dangerous illusion that a campus has solved or is systematically addressing its problems with sexual assault, racial aggression, and other forms of campus violence, when, in fact, the opposite may be true.
10. Trigger warnings may strike some as a cost-effective solution to rising concerns about student mental health, campus cultures that condone sexual assault, and similar big-ticket issues. However, there are hidden costs to a trigger warning policy, for example, the expense, labor, and loss of trust and morale that result from the increased number of Title IX complaints against professionally vulnerable faculty members.
What do we propose as an alternative to trigger warnings? We feel faculty and students are best-served by the following:
1. From faculty -- syllabuses and/or pages on course websites that include referral to on-campus resources available to students experiencing difficulties with course materials in ways that need to be addressed with specific expertise – counseling resources, support groups, advising, relevant student organizations, etc. If such resources do not exist or are insufficiently funded, we believe our efforts should be directed toward establishing and increasing support for them. Mandating trigger warnings should not be a substitute for this important work.
2. From administrators -- systematic, robust, and proactive institutional attention to such matters as sexual assault, racially motivated attacks, harassment, and other practices of violence on campus.
3. From faculty and administrators -- faculty development opportunities that will enhance our ability to recognize and respond appropriately to students’ strong emotional reactions to materials that ask them to witness or analyze violence, question their own privilege, understand their own place in structures of injustice, and undertake other psychologically difficult tasks.
4. From students -- awareness that the faculty who teach the very materials that help them understand and combat racism, sexism, heterosexism, ableism, etc., as well as trauma, violence, and practices of injustice, are often the most vulnerable members of their professional context. Administrations may use student complaints to marginalize particular faculty and particular topics, and/or use a trigger mandate/recommendation to delimit what can be taught in the first place.
Some students may read trigger warnings as evidence that faculty and the university care for them and recognize their histories of trauma. We believe the university has a responsibility to provide that care in the form of appropriate resources and support beyond any statement on a course syllabus. As well-intended as trigger warnings may seem, they make promises about the management of trauma’s afterlife that a syllabus, or even a particular faculty member, should not be expected to keep.
The authors of this piece are:
Elizabeth Freeman, professor of English at the University of California at Davis.
Brian Herrera, assistant professor of theater at Princeton University.
Nat Hurley, assistant professor of English and film studies at the University of Alberta.
Homay King, associate professor of the history of art at Bryn Mawr College.
Dana Luciano, associate professor of English at Georgetown University.
Dana Seitler, associate professor of English at the University of Toronto.
Patricia White, professor of film and media studies at Swarthmore College.