“What would the United States look like if we really gave up on liberal education and opted only for specialized or vocational schools? Would that really be such a bad thing?”
The interviewer was trying to be provocative, since I had just written a book entitled Beyond The University: Why Liberal Education Matters. What exactly would be the problem, he went on, if we suddenly had a job market filled with people who were really good at finance, or engineering, or real estate development?
Apart from being relieved that he hadn’t included expertise in derivatives trading in his list of specializations, I did find his thought experiment interesting. Would there be real advantages to getting students to hunker down early into more specific tracks of learning? In that way they would be “job ready” sooner, contributing more quickly to the enterprises of which they are a part, and acquiring financial independence at the same time. Would that really be such a bad thing?
The debate between those who want students to specialize quickly and those who advocate for a broad, contextual education is as old as America itself. The health of a republic, Thomas Jefferson argued, depends on the education of its citizens. Against those arguing for more technical training, he founded the University of Virginia, emphasizing the freedom that students and faculty would exercise there. Unlike Harvard University and its many imitators, devoted to predetermined itineraries through traditional fields, he said, Virginia would not prescribe a course of study to direct graduates to “the particular vocations to which they are destined.”
At Mr. Jefferson’s university, “every branch of science, useful at this day, may be taught in its highest degree.” But who would determine which pursuits of knowledge would prove useful?
Jefferson, a man of the Enlightenment, had faith that the diverse forms of learning would improve public and private life. Of course, his personal prejudices limited his interest in the improvement of life for so many. However, his conception of “useful knowledge” was capacious and open-ended – and this was reflected in his design for the campus in Charlottesville. He believed that the habits of mind and methods of inquiry characteristic of the modern sciences lent themselves to lifelong learning that would serve one well whether one went on to manage a farm or pursue a professional career. It is here we see the dynamic and open-ended nature of Jefferson’s understanding of educational “usefulness.”
His approach to knowledge and experimentation kept open the possibility that any form of inquiry might prove useful. The sciences and mathematics made up about half of the curriculum at Virginia, but Jefferson was convinced that the broad study of all fields that promoted inquiry, such as history, zoology, anatomy and even ideology would help prepare young minds. The utility was generally not something that could be determined in advance, but would be realized through what individuals made of their learning once outside the confines of the campus. The free inquiry cultivated at the university would help build a citizenry of independent thinkers who took responsibility for their actions in the contexts of their communities and the new Republic.
Jefferson would have well understood what many business leaders, educators and researchers recognize today: that given the intense interconnection of problems and opportunities in a globalized culture and economy, we require thinkers who are comfortable with ambiguity and can manage complexity. Joshua Boger, founder of Vertex Pharmaceuticals (and chair of the board at Wesleyan University), has pointed out how much creative and constructive work gets done before clarity arrives, and that people who seek clarity too quickly might actually wind up missing a good deal that really matters. Boger preaches a high tolerance for ambiguity because the contemporary world is so messy, so complex.
Tim Brown, CEO of IDEO, one of the most innovative design firms in the world, has lamented that many designers “are stuck with an approach that seems to be incapable of facing the complexity of the challenges being posed today.” He calls for a flexible framework that leaves static blueprints behind in favor of “open-ended, emergent, evolutionary approaches to the design of complex systems” that “can result in more robust and useful outcomes.” Like many CEOs across the country, Brown recognizes that more robust and useful outcomes will come from learning that is capacious and open-ended -- from liberal education.
At the Drucker Forum last year, Helga Nowotny, president of the European Research Council, described what she called the “embarrassment of complexity” – efforts based in data analysis to dissolve ambiguity that lead to more conformity and less creativity. She called for an ethos among business and government leaders that would instead “be based on the acknowledgement that complexity requires integrative thinking, the ability to see the world, a problem or a challenge from different perspectives.” That’s a call for integrative thinking based in liberal learning.
In America, liberal education has long been animated by the tension between broad, open-ended learning and the desire to be useful in a changing world. Calls for dissolving this tension in favor of narrow utilitarian training would likely produce just the opposite: specialists unprepared for change, whose skills may quickly become obsolete.
So, what would America look like if we abandoned this grand tradition of liberal education? Without an education that cultivates an ability to learn from the past while stimulating a resistance to authority, without an education that empowers students for lifelong learning and inquiry, we would become a cultural and economic backwater, competing with various regions for the privilege of operationalizing somebody else’s new ideas. In an effort at manic monetization without critical thinking, we would become adept at producing conformity rather than innovation.
The free inquiry and experimentation of a pragmatic liberal education open to ambiguity and complexity help us to think for ourselves, take responsibility for our beliefs and actions, seize opportunities and solve problems. Liberal education matters far beyond the university because it increases our capacity to shape a complex world.
A core purpose of remedial education is to provide all students with a real opportunity for college success, regardless of their skill level or academic background. Inside Higher Ed recently published opinion pieces with different takes on the best ways to design remedial programs. This exchange between Stan Jones of Complete College America and Hunter Boylan of the National Center for Developmental Education is a welcome sign. We are concerned, however, that an important consideration has been largely undervalued in the current conversation. Students assigned to remedial education in college are not a uniform group, and the colleges they attend are far from homogenous. Treating them as such masks important differences in opportunity and achievement due to differences in students’ prior academic preparation, incoming skill level, age, race, income and status as first-generation college students.
Students who start in developmental education, particularly those at the lowest levels, face significant obstacles that frequently lead to gaps in educational opportunity and achievement down the road. While there has been considerable rhetoric about the existence of these gaps on the front end, there has been surprisingly little data used to show how the solutions being put forward today would actually address these inequities in the long run. Reform efforts that neglect to address these disparities only threaten to perpetuate them. We support extending the current conversation on reform efforts in developmental education to include four critical considerations:
1. An explicit focus on closing opportunity gaps for students. Opportunity gaps arise when students have different degrees of access to college programs in high school, and these opportunities vary according to a variety of factors, such as school quality and academic preparation. Opportunity gaps are the first step in closing achievement gaps nationwide, yet they are almost never referenced in reports of developmental education reform. Jobs for the Future’s Early College Expansion report provides one example of how closing postsecondary opportunity gaps can be done, and highlights linkages between opportunity gaps and achievement gaps for various groups of students. Starting college while still in high school has been shown to have a significant impact on college enrollment, retention and success for a wide range of student populations. Expanding these opportunities to all high schools and all students, including at-risk students, is one of the most critical steps in closing achievement gaps and fulfilling the completion agenda.
2. An explicit focus on closing achievement gaps for students. The Lumina Foundation recently issued its annual report, A Stronger Nation Through Higher Education, which highlighted persistent college degree attainment gaps by race, with "black adults (ages 25-64) reporting 28 percent degree attainment, Native Americans representing 23 percent, and Hispanics with 20 percent attainment, compared to 59 percent for Asians and 44 percent for whites." Also, college participation rates still differ significantly based on income. “While 82.4 percent of potential students (of all races) in the top third of the income scale enroll in college, only 53.5 percent of those in the bottom third do so,” the report said. Jamie Merisotis, Lumina’s president, states, “As the nation’s population becomes increasingly diverse, we must do more to address these troubling attainment divides … We cannot successfully meet our nation’s future economic and social needs unless educational achievement opportunities are available to all Americans.”
3. Comprehensive examples and disaggregated data showing how proposed solutions will address gaps in opportunity and achievement. This information is vital if the chasm between national goals and institutional implementation is to be bridged. Yet these details are notably missing from many national reports and publications. Large-scale solutions require local implementation, and many colleges and programs have little knowledge or information on achievement gaps by race, income status or academic ability for their own students. The 2011 report from MDRC, Turning the Tide: Five Years of Achieving the Dream in Community Colleges, illuminates this divide with the finding that “overcoming racial, ethnic and income achievement gaps was not a key goal at the majority of Round 1 colleges. Only eight college leaders made explicit attempts to raise awareness about those issues.” As we move forward into an era of reform in developmental education, it is more important than ever to not only acknowledge, but to confront these gaps in educational attainment. Education Trust’s Replenishing Opportunity in America provides helpful examples that show the impact of the solutions on various student groups. This should be the norm when it comes to national reports. Providing these details and data about the proposed solutions will both enrich the conversation and help to gain buy-in of stakeholders.
4. Examples of other successful models. Boylan and Jones both encourage looking to new, innovative models in our efforts to reform remedial education, and we agree. Mastery learning, for example, has been shown to not only close race and gender gaps; it has also been shown to provide a solid foundation for college success. Given the scarcity of examples and data surrounding achievement gaps in the current reports, additional models and examples should be sought out and welcomed into the conversation.
In conclusion, overcoming racial, ethnic and income achievement gaps should be a goal of all American colleges. We cannot achieve equity until we are able to identify and address inequity. Simply acknowledging achievement gaps does not close them. Putting forth models that have actually closed these gaps, complete with details and data, will help to get us there. Using data to illuminate and address gaps in student opportunity and achievement should be the focus of the national conversation and reform efforts in developmental education going forward.
John Squires and Angela Boatman
John Squires is head of the mathematics department at Chattanooga State Community College. Angela Boatman is an assistant professor of public policy and higher education at Vanderbilt University.
We write as faculty members teaching in gender/sexuality studies, critical race studies, film and visual studies, literary studies, and cognate fields. We empathize with the difficulties our students bring into the classroom, from their pasts and/or from their ongoing battles with violence, sexual assault, racism, and other traumatizing events, both everyday and extraordinary. As faculty of color, female, and/or queer faculty, many of us have had some of the same experiences.
However, we are concerned about the movement on college campuses to mandate or encourage “trigger warnings” – notifications that class material may cause severe negative reactions – on class syllabuses. We are currently watching our colleagues receive phone calls from deans and other administrators investigating student complaints that they have included “triggering” material in their courses, with or without warnings. We feel that this movement is already having a chilling effect on our teaching and pedagogy. Here, we outline why a movement with the very salutary intent of minimizing student pain may be, in fact, ineffectual as well as harmful to both students and faculty. We offer this outline in the spirit of collective engagement amongst faculty, students, and administrators because we want to support both faculty in their choice to teach difficult material and students in their need for an ethic of care at the university.
1. Faculty cannot predict in advance what will be triggering for students. The idea that trauma is reignited by representations of the particular traumatizing experience is not supported by the research on post-traumatic stress disorder and trauma. Flashbacks, panic attacks, and other manifestations of past trauma can be triggered by innocuous things: a smell, a sudden movement, a color. There is simply no way for faculty to solve for this with warnings or modified course materials.
2. There is no mechanism, in the discourse of “triggering,” for distinguishing material that is oppositional or critical in its representation of traumatizing experience from that which is sensationalistic or gratuitous.
3. Most faculty are not trained to handle traumatic reactions. Although many of us include analyses of the cultural logics and legacies of trauma and/or perpetration in our courses, this expertise does not qualify faculty to offer the professional responses traumatized students may need. Institutions seriously committed to caring for traumatized students ought to be directing students, from their first days on campus, to a rich array of mental health resources. Trigger warnings are not an adequate substitute for these resources or for the information students need to get help.
4. PTSD is a disability; as with all disabilities, students and faculty deserve to have effective resources provided by independent campus offices that handle documentation, certification, and accommodation plans rather than by faculty proceeding on an ad hoc basis.
5. Trigger warnings may encourage students to file claims against faculty rather than seek support and resources for debilitating reactions to stressors. In fact, the complaint is implied in the structure of a warning; the warning serves as a guarantee that students will not experience unexpected discomfort and implies that if they do, a contract has been broken.
6. Even the best-intended, ad hoc declarations on syllabuses by individual faculty may lead students to expect or demand similar “disclosures” from other faculty who may feel that other ways of addressing students’ emotional reactions to material are more effective.
7. Faculty of color, queer faculty, and faculty teaching in gender/sexuality studies, critical race theory, and the visual/performing arts will likely be disproportionate targets of student complaints about triggering, as the material these faculty members teach is by its nature unsettling and often feels immediate.
8. Untenured and non-tenure-track faculty will feel the least freedom to include complex, potentially disturbing materials on their syllabuses even when these materials may well serve good pedagogical aims, and will be most vulnerable to institutional censure for doing so.
9. Trigger warnings may provide a dangerous illusion that a campus has solved or is systematically addressing its problems with sexual assault, racial aggression, and other forms of campus violence, when, in fact, the opposite may be true.
10. Trigger warnings may strike some as a cost-effective solution to rising concerns about student mental health, campus cultures that condone sexual assault, and similar big-ticket issues. However, there are hidden costs to a trigger warning policy, for example, the expense, labor, and loss of trust and morale that result from the increased number of Title IX complaints against professionally vulnerable faculty members.
What do we propose as an alternative to trigger warnings? We feel faculty and students are best-served by the following:
1. From faculty -- syllabuses and/or pages on course websites that include referral to on-campus resources available to students experiencing difficulties with course materials in ways that need to be addressed with specific expertise – counseling resources, support groups, advising, relevant student organizations, etc. If such resources do not exist or are insufficiently funded, we believe our efforts should be directed toward establishing and increasing support for them. Mandating trigger warnings should not be a substitute for this important work.
2. From administrators -- systematic, robust, and proactive institutional attention to such matters as sexual assault, racially motivated attacks, harassment, and other practices of violence on campus.
3. From faculty and administrators -- faculty development opportunities that will enhance our ability to recognize and respond appropriately to students’ strong emotional reactions to materials that ask them to witness or analyze violence, question their own privilege, understand their own place in structures of injustice, and undertake other psychologically difficult tasks.
4. From students -- awareness that the faculty who teach the very materials that help them understand and combat racism, sexism, heterosexism, ableism, etc., as well as trauma, violence, and practices of injustice, are often the most vulnerable members of their professional context. Administrations may use student complaints to marginalize particular faculty and particular topics, and/or use a trigger mandate/recommendation to delimit what can be taught in the first place.
Some students may read trigger warnings as evidence that faculty and the university care for them and recognize their histories of trauma. We believe the university has a responsibility to provide that care in the form of appropriate resources and support beyond any statement on a course syllabus. As well-intended as trigger warnings may seem, they make promises about the management of trauma’s afterlife that a syllabus, or even a particular faculty member, should not be expected to keep.
The authors of this piece are:
Elizabeth Freeman, professor of English at the University of California at Davis.
Brian Herrera, assistant professor of theater at Princeton University.
Nat Hurley, assistant professor of English and film studies at the University of Alberta.
Homay King, associate professor of the history of art at Bryn Mawr College.
Dana Luciano, associate professor of English at Georgetown University.
Dana Seitler, associate professor of English at the University of Toronto.
Patricia White, professor of film and media studies at Swarthmore College.
Trigger warnings in the classroom have been the subject of tremendous debate in recent weeks, but it’s striking how little the discussion has contemplated what actual trigger warnings in actual classrooms might plausibly look like.
The debate began with demands for trigger warnings by student governments with no power to compel them and suggestions by administrators (made and retracted) that faculty consider them. From there the ball was picked up mostly by observers outside higher ed who presented various arguments for and against, and by professors who repudiated the whole idea.
What we haven’t heard much of so far are the voices of professors who are sympathetic to the idea of such warnings talking about what they might look like and how they might operate.
As it turns out, I’m one of those professors, and I think that discussion is long overdue. I teach history at Hostos Community College of the City University of New York, and starting this summer I’m going to be including a trigger warning in my syllabus.
I’d like to say a few things about why.
To start off, I think it’s important to be clear about what trigger warnings are, and what purpose they’re intended to serve. Such warnings are often framed — and not just by critics — as a “you may not want to read this” notice, one that’s directed specifically at survivors of trauma. But their actual purpose is considerably broader.
Part of the confusion arises from the word “trigger” itself. Originating in the psychological literature, the term can be misleading in a non-clinical context, and indeed many people who favor such warnings prefer to call them “content warnings” for that reason. It’s not just trauma survivors who may be distracted or derailed by shocking or troubling material, after all. It’s any of us, and a significant part of the distraction comes not from the material itself but from the context in which it’s presented.
In the original cut of the 1933 version of the film "King Kong," there was a scene (depicting an attack by a giant spider) that was so graphic that the director removed it before release. He took it out, it’s said, not because of concerns about excessive violence, but because the intensity of the scene ruined the movie — once you saw the sailors get eaten by the spider, the rest of the film passed by you in a haze.
A similar concern provides a big part of the impetus for content warnings. These warnings prepare the reader for what’s coming, so their attention isn’t hijacked when it arrives. Even a pleasant surprise can be distracting, and if the surprise is unpleasant the distraction will be that much more severe.
I write quite a bit online, and I hardly ever use content warnings myself. I respect the impulse to provide them, but in my experience a well-written title and lead paragraph can usually do the job more effectively and less obtrusively.
A classroom environment is different, though, for a few reasons. First, it’s a shared space — for the 75 minutes of the class session and the 15 weeks of the semester, we’re pretty much all stuck with one another, and that fact imposes interpersonal obligations on us that don’t exist between writer and reader. Second, it’s an interactive space — it’s a conversation, not a monologue, and I have a responsibility to encourage that conversation as best I can. Finally, it’s an unpredictable space — a lot of my students have never previously encountered some of the material we cover in my classes, or haven’t encountered it in the way it’s taught at the college level, and don’t have any clear sense of what to expect.
For all these reasons, I’ve concluded that it would be sound pedagogy for me to give my students notice about some of the challenging material we’ll be covering in class — material relating to racial and sexual oppression, for instance, and to ethnic and religious conflict — as well as some information about their rights and responsibilities in responding to it. Starting with the summer semester, as a result, I’ll be discussing these issues during the first class meeting and including a notice about them in the syllabus.
My current draft of that notice reads as follows:
Course Content Note
At times this semester we will be discussing historical events that may be disturbing, even traumatizing, to some students. If you ever feel the need to step outside during one of these discussions, either for a short time or for the rest of the class session, you may always do so without academic penalty. (You will, however, be responsible for any material you miss. If you do leave the room for a significant time, please make arrangements to get notes from another student or see me individually.)
If you ever wish to discuss your personal reactions to this material, either with the class or with me afterwards, I welcome such discussion as an appropriate part of our coursework.
That’s it. That’s my content warning. That’s all it is.
I should say as well that nothing in these two paragraphs represents a change in my teaching practice. I have always assumed that if a student steps out of the classroom they’ve got a good reason, and I don’t keep tabs on them when they do. If a student is made uncomfortable by something that happens in class, I’m always glad when they come talk to me about it — I’ve found we usually both learn something from such exchanges. And of course students are still responsible for mastering all the course material, just as they’ve always been.
So why the note, if everything in it reflects the rules of my classroom as they’ve always existed? Because, again, it’s my job as a professor to facilitate class discussion.
A few years ago one of my students came to talk to me after class, distraught. She was a student teacher in a New York City junior high school, working with a social studies teacher. The teacher was white, and almost all of his students were, like my student, black. That week, she said, one of the classes had arrived at the point in the semester given over to the discussion of slavery, and at the start of the class the teacher had gotten up, buried his nose in his notes, and started into the lecture without any introduction. The students were visibly upset by what they were hearing, but the teacher just kept going until the end of the period, at which point he finished the lecture, put down his papers, and sent them on to math class.
My student was appalled. She liked these kids, and she could see that they were hurting. They were angry, they were confused, and they had been given nothing to do with their emotions. She asked me for advice, and I had very little to offer, but I left our meeting thinking that it would have been better for the teacher to have skipped that material entirely than to have taught it the way he did.
History is often ugly. History is often troubling. History is often heartbreaking. As a professor, I have an obligation to my students to raise those difficult subjects, but I also have an obligation to raise them in a way that provokes a productive reckoning with the material.
And that reckoning can only take place if my students know that I understand that this material is not merely academic, that they are coming to it as whole people with a wide range of experiences, and that the journey we’re going on together may at times be painful.
It’s not coddling them to acknowledge that. In fact, it’s just the opposite.
Angus Johnston teaches history at Hostos Community College and is the proprietor of the website studentactivism.net.
The Western Interstate Commission for Higher Education (WICHE) today announced plans to spin off a learning-analytics project as a separate nonprofit group. The commission formed the Predictive Analytics Reporting Framework in 2011 as a collaboration among six online institutions, which shared data about student learning. Since then the work has broadened to include on-ground and competency-based institutions. WICHE said today that the data-services collaborative would become an independent organization by the end of the year.
There has been extensive hand-wringing about what can be done to help young graduates succeed in today’s tough labor market – especially in the spring, as high school seniors decide on their college offers, and college seniors prepare to graduate and face the world. Unemployment and underemployment rates among recent college graduates in the United States – largely a result of the recession’s lingering damage – are too high. And we’ve all seen the headlines questioning the value of college and the surveys that show employers bemoaning the “preparedness gap.”
But I am full of optimism.
As a university president, I spend far too much time among skilled, talented, motivated young people to be anything but hopeful about the future of higher education and the capabilities of the millennial generation – those born roughly between the early 1980s and the early 2000s. And honestly, surveys by my institution, Bentley University, of recruiters and students don’t reflect these headlines.
It’s perplexing. Is there really such a disconnect between this generation and good jobs? And if there is, let’s figure out how to resolve it instead of repeatedly touting the problem. So we chose to dig a little deeper and try to uncover the real issues. How do key stakeholders actually view the preparedness issue? And, more important, what will it take to ensure that millennials are fully prepared to succeed in the workplace?
We commissioned KRC Research to conduct a comprehensive preparedness survey of over 3,000 stakeholders, including employers, higher education leaders, students, parents, and recent college graduates. The survey found consensus in surprising places -- from rating recent graduates’ level of workforce preparedness to defining exactly what preparedness means.
One of the most interesting sets of findings revealed that businesses are conflicted about the skills they want in their new employees and, consequently, are sending mixed messages to the marketplace. A majority of business decision-makers and corporate recruiters say that hard and soft skills are equally important for success in the workplace. (Hard skills are tangible ones, such as a student’s technical and professional skills, while soft skills include communicating well, teamwork and patience.)
Yet when asked to assess the importance of a comprehensive set of individual skills, business leaders put soft skills at the top of their list and industry and job-specific skills at the bottom; only 40 percent of employers say that the latter are important to workplace success. But while employers say soft skills are vital to long-term career success, they prefer to hire candidates with the industry-specific skills needed to hit the ground running, even if those candidates have less potential for future growth.
In the face of such conflicting information from employers, how should students and educators respond? Should they emphasize soft skills or hard skills?
The answer: This is a false choice. Students don’t need to – and shouldn’t have to – choose between hard and soft skills. It’s important for colleges to arm students with both skill sets -- whether a student is majoring in business or literature. By developing curriculums that fuse liberal arts and professional skills and by providing hands-on learning experiences, we can give our students the range of skills that are critical for the modern workplace.
This “fusion” was one of the popular solutions tested in the survey, and many schools are doing it already. Brandeis University, a private university with a liberal arts focus, says that its new undergraduate business program is already one of its most popular majors. (Brandeis points out that most of its business majors are double majors.) At West Virginia University, the College of Business and Economics and the School of Public Health have partnered to create a dual-degree program that will infuse business skills into the field of public health. At Georgetown’s McDonough School of Business, students in the freshman “Ethics of Entrepreneurship” seminar take on a semesterlong project designed to help them flex their critical thinking and writing muscles in a global and social framework.
Bentley has also adopted several strategies to ensure we are preparing our students for success. Virtually every student here majors or minors in business, while simultaneously pursuing a core of arts and sciences courses that focus on expanding and inspiring traditional “business” thinking. We recently expanded on our popular liberal studies major, an optional second major combined with a business major, by launching six-credit “fusion” courses co-taught by business and arts and sciences faculty. Combinations include a management course (Interpersonal Relations in Management) with an English course (Women and Film) to explore how women are perceived in film and how this can affect management styles; and a global studies course (U.S. Government and Politics) with an economics course (Macroeconomics) to teach how politics and economics work together and to demonstrate that understanding both is often essential to doing either one well.
All this study must be combined with hands-on, “experiential” learning – the pathway to hard skills. This is where business organizations can play an important role. Santander, the multinational bank, created a scholarship program to support academic, research, and technological projects – we are proud to be one of the 800 institutions in its program. Corporate partners can also help shape curriculums to teach skills as they are actually practiced in the workplace. EY LLP (formerly Ernst & Young) worked closely with us to merge accounting and finance for freshmen and sophomores, since those disciplines are inextricably linked in the business environment.
These strategies aim to equip students with both hard and soft skills, and they can be adopted and adapted by many colleges. A challenge in higher education is that some academic models can be so discipline-specific that students miss out on cross-disciplinary opportunities to integrate their knowledge. But it doesn’t have to work this way.
Like other colleges and universities that are innovating and experimenting, we are seeing returns on this curricular investment. One way to measure this: our survey of the Class of 2013 shows that 98 percent of responding graduates are employed or attending graduate school full time (this includes information from 95 percent of the class). Retention, number and availability of internships and repayment of student debt are also key metrics.
I encourage my higher education colleagues to refocus their attention on the ways we can work together to strengthen our education models. Millennials, a group that includes our current students, are counting on us to prepare them for successful careers and life. And in the long run, it is an economic imperative that we do so.
Gloria Cordes Larson is president of Bentley University.
Students in Kerry Cronin's Boston College course on philosophy and ethics have an unusual way to earn extra credit: go on a date. The Boston Globe described how Cronin responded to student questions about the concept of dating by encouraging them to actually do so. She said most students seem comfortable going out in groups and having hook-ups, but not dating. To qualify for credit, the dates must focus on personal interaction. Dates must be 45-90 minutes, must be with a person of potential romantic interest, the invitation must be made in person (not electronically), and the date can't involve alcohol, kissing or sex.
In 1869, Charles W. Eliot, a professor at the Massachusetts Institute of Technology, wrote an essay in The Atlantic Monthly entitled “The New Education.” He began with a question on the mind of many American parents: “What can I do with my boy?” Parents who were able to afford the best available training and did not think their sons suited for the ministry or a learned profession, Eliot indicated, sought a practical education, suitable for business “or any other active calling”; they did not believe that the traditional course of study adopted by colleges and universities 50 years earlier was still relevant. Less than a year later, Eliot became president of Harvard. Among the reforms he initiated were an expansion of the undergraduate curriculum and substantial improvement in the quality and methods of instruction in the law school and the medical school.
The debate between advocates of traditional liberal learning and partisans of a more “useful” education, Michael Roth, the president of Wesleyan University, reminds us, has deep roots in American soil. In Beyond the University (Yale University Press), he provides an elegant and informative survey of the work of important thinkers, including Benjamin Franklin, Thomas Jefferson, Ralph Waldo Emerson, W.E.B. DuBois, Jane Addams, William James, John Dewey, and Richard Rorty, who, despite significant differences, embraced liberal education because it “fit so well with the pragmatic ethos that linked inquiry, innovation, and self-discovery.” At a time in which liberal learning is under assault, Roth draws on the authority of these heavyweights to argue that “it is more crucial than ever that we not abandon the humanistic frameworks of education in favor of narrow, technical forms of teaching intended to give quick, utilitarian results.”
Most of Beyond the University is devoted to claims by iconic intellectuals about the practical virtues of liberal learning, which Roth endorses (with occasional qualifications). Exhibiting a “capacious and open-ended” understanding of educational “usefulness,” Roth indicates, Thomas Jefferson opted for free inquiry at his university in Charlottesville, Va., to equip citizens in the new republic to think for themselves and take responsibility for their actions. Ralph Waldo Emerson resisted education as mere job training; but, he indicated, it should impart knowledge to develop individuals willing and able to use what we now call “critical thinking” to challenge the status quo.
Acknowledging that different people need different kinds of educational opportunities, W.E.B. DuBois nonetheless insisted that the final product of training “must be neither a psychologist nor a brick mason, but a man.” Liberal learning, Jane Addams emphasized, inculcates “affectionate interpretation,” which prepares individuals not only to defend themselves against those with different points of view, but to empathize with others and act in concert with them. And John Dewey, the most influential philosopher of education in the 20th century, looked to a liberal education, according to Roth, to help students learn the lessons of experiment and experience, by trying things out and assessing the results, by themselves and with others, and, then, if appropriate, revising their behavior.
Roth’s approach – a reliance on the authority of seminal thinkers – is not without problems. As he knows, the nature of higher education – and its perceived roles and responsibilities – has changed dramatically since colleges focused on liberal learning. In 1910, only 9 percent of students received a high school diploma; few of them went on to college. These days, about 40 percent of young men and women get a postsecondary degree. Undergraduate, master’s, and doctoral degrees, moreover, are now required, far more than in the days of Emerson and Eliot, for entry into the most prestigious, and high-paying, professions. Jamie Merisotis, president of the Lumina Foundation, is surely right when he asserts that “to deny that job skills development is one of the key purposes of higher education is increasingly untenable” – and that integration of specific skills into the curriculum can help graduates get work and perform their assigned tasks well.
Roth does not specify how liberal learning might “pull different skills together in project-oriented classes.” Nor does he adequately address “the new sort of criticism” directed at liberal learning. A liberal arts education, many critics now claim, does not really prepare students to love virtue, be good citizens, or recognize competence in any field. As Roth acknowledges, general education, distribution requirements, and free electives are not effective antidotes to specialization; they have failed to help establish common academic goals for students. And, perhaps most disturbingly, doubt has now been cast on the proposition that the liberal arts are the best, and perhaps the only, pathway to “critical thinking” (the disciplined practice of analyzing, synthesizing, applying, and evaluating information).
President Roth may well be right that liberal learning “will continue to be a fundamental part of higher education” if (and, he implies, only if) it rebalances critical thinking and practical exploration. The key question, it seems to me, is how to rebalance, while preserving the essence of liberal learning, at a time in which higher education in general and, most especially, the humanities are under sustained attack by cost-conscious advocates of an increasingly narrow vocationalism, who are certain to be unpersuaded by the testimony of long-dead intellectuals. The task is all the more daunting because it will have to be carried out by proponents and practitioners of the liberal arts, many of whom, unlike Michael Roth, are now in despair or denial, or have lost faith.
Glenn C. Altschuler is the Thomas and Dorothy Litwin Professor of American Studies at Cornell University.