This summer, the faculty of Shimer College held a discussion of Jacques Rancière's book The Ignorant Schoolmaster: Lessons in Intellectual Emancipation. In it, he discusses the educational theory and practice of Joseph Jacotot, who claimed that one could teach a subject one didn't even know in the first place. For Jacotot, teaching isn't a matter of expertise, but of determination. It isn't about transmitting knowledge to the student, but about holding students accountable to the material that they are working on.
Though the method Rancière described was more radical than anything we would actually try, the general approach resonates with what we try to do at Shimer, a small liberal arts college in the Great Books tradition. Our classes are all discussion-based, centered on important texts, artworks, and scientific experiments, and the professor serves not to instruct the students but to keep them on task and nudge them in the right direction. A handful of our faculty members have actually taught the whole curriculum, covering the humanities, social sciences, and natural sciences, and all are required to teach in at least two areas.
All of this remained mostly theoretical for me until this semester, however, when I began teaching Humanities 1: Art and Music. I am knowledgeable about fine arts and am a passable classical pianist, but my Ph.D. is in theology and philosophy and the course is the first I've ever taught where the primary object of study was something other than texts. The challenge of the course is to find a way of talking about art that is neither purely impressionistic and personal nor overly technical and scholarly.
The problem is most pronounced with music, where students often express a need for something called “music theory” that will permit them to talk about the experience of music in an intelligent and informed way. For the visual arts, there’s a more immediate intelligibility, given that the majority of the works we discuss in the course are representational (or at least suggestive of representation) — yet even there, students can feel that they don’t know what to say beyond assessing whether the painting represents what it’s supposed to in a way that is somehow “realistic.”
Our approach is to give students a handful of “hooks” that allow them to point out certain aspects of a given artwork. We begin with the format of the class, which is centered on Ovid's Metamorphoses, a work that has inspired artists for generations. As a result of this framing, the majority of works that we study are representational or narrative in form, giving the students a basic orientation. The ready availability of different works on the same subject also gives us the opportunity to highlight the differences between media and the types of choices that artists make within the same medium.
On the level of form, we try to give students a ready familiarity with a few basic concepts. In music, the most important goals include being able to hear consonance vs. dissonance (which is fairly intuitive once it’s pointed out), knowing how to talk about melody and harmony, and being broadly familiar with the distinction between major and minor. With painting, we focus on the use of perspective, the interaction of colors, and the composition of the piece as a whole. These tools give them enough to begin thinking about how the expressive content of the artwork can reinforce, complement, or complicate the emotional content of the narrative being portrayed.
By halfway through the semester, I had developed a certain level of confidence on art and music. Yet the syllabus threw me a curve ball when I was required to introduce a new art form by taking the students on an architectural tour in downtown Chicago. Here I felt my ignorance much more acutely, and instead of trying to create my own tour, I asked a senior faculty member to demonstrate the tour he had given the previous year, which I simply repeated.
We had only a short time for the tour, and so I could only point out a handful of extremely basic points. I showed them a few buildings that were built before the skyscraper technique was developed (basically, the outer walls had to be load-bearing before the skyscraper technique allowed for an internal distribution of the weight) as well as some early skyscrapers. I talked about the ways that the architect can get us to “read” a building — how the eye is drawn upward, how a building can be “capped” with a different design on the top floors, how the base of the building can provide indications of where the entrances are and how the facade can reinforce that. We saw some buildings that were highly ornamented and some that were very stark. We also looked at lobbies in a similar variety of styles. Finally, I tried to point out to them the way buildings interact with each other.
None of this was very advanced, and indeed, I was most often simply pointing out to the students what my colleague had pointed out to me on our tour. Yet the students reported that they had benefited from simply being told to step back and actually look at the buildings and from being given certain rough-and-ready indications of what to look for. Some reported they had never really thought about architecture at all, that it had always faded into the background. Even a more knowledgeable student said that being asked to look at buildings in the context of the cityscape rather than in isolation was a step beyond what he’d done before.
None of this resulted from any special skill I brought to the table — even the mechanical execution of the tour was pretty inept, and I’m known to mumble. (My students strongly discouraged me from pursuing a career as a tour guide.) It was simply a matter of being told to look and being given a few specific things to look at. It made them want to look more closely in the future, as indeed preparing for the tour made me want to look more closely as well.
While it was most pronounced with architecture, I've been learning along with the students throughout the semester. During trips to the Art Institute of Chicago, I've found that my way of looking at paintings has changed. A recent visit to the symphony with my students revealed that I'm getting better at following and thinking about classical music — after the concert, I found that I really wanted to talk about it and even investigate it further, in a way that wouldn’t have been true before. I see similar progress in my students, as they become more and more comfortable with talking about the formal elements of the artworks and relate them in more sophisticated ways to their representational or narrative content. In fact, one of my students who transferred from a local art school claims that she has had more and better discussion of art in our class than she did in art school.
At this point, my reader may be skeptical. Perhaps I am giving students an adequate introduction to the fine arts, making up for my ignorance with my enthusiasm — but wouldn't they be better off with a more knowledgeable professor? In some ways, I'm sure they would. Yet I would turn the tables and point out the disadvantages of having an accomplished expert teach an introductory course. Too often, such classes consist in the delivery of scholarly knowledge that only serves to exacerbate the distance that the students feel from the material itself. Instead of learning how to look at an artwork or listen to a piece of music, students learn how to categorize them: this is early Renaissance, this is Impressionist....
The two skills don't have to be mutually exclusive, but on a practical level, they most often are — and I would rather that my students begin by gaining the confidence to analyze and respond to a work and only then delve into the historical and scholarly background according to their interest. We live in a time where there's no shortage of access to facts, but college may be their one chance to develop a real understanding of how art and music work. From that perspective, my inability to supply “the right answer” or to indulge my students' curiosity about historical trivia that distracts our attention from the work before us counts as a positive advantage.
This isn't to say, of course, that I must never teach in my own area of expertise. Indeed, my experience as an “ignorant schoolmaster” has already changed the way I think about teaching things within my comfort zone as well. It has pushed me to think more about holding students accountable for the ways they reach their own answers than about how best to give them — or Socratically help them stumble upon — the “right answer.” Even in classes where I bring much more to the table, the focus is and must be the material we're working on together, not all the information I'm bringing from the outside. More than that, though, all that information must be put to the test of the material itself, so that I always have to be open to the possibility that the interpretation I brought to the table is wrong, or at least not the whole story.
The approach I'm describing here goes against many of the deeply ingrained habits that academics develop in graduate school and carry over into their teaching. While Rancière and others would cast moral aspersions on the expertise-centered approach to education, I view it more as a failure of imagination. Robert Hutchins, the University of Chicago president whose approach forms the basis for Shimer's curriculum, once said that liberal arts colleges tend to imitate graduate programs because at least graduate programs have a clear idea of what they're doing — namely, producing experts. An undergraduate education, however, neither can nor should achieve that goal. The liberal arts approach in particular provides a unique opportunity to form broad-minded critical and creative thinkers who have the right combination of intellectual boldness and intellectual humility to enter a wide variety of professions and explore many bodies of knowledge. A crucial part of that formation is learning to have the courage to admit one's own ignorance, and I believe students would be better served if faculty members were more commonly called upon to display that same courage.
America's public research universities face a challenging economic environment characterized by rising operating costs and dwindling state resources. In response, institutions across the country have looked toward the corporate sector for cost-cutting models. The hope is that implementing these “real-world” strategies will centralize redundant tasks (allowing some to be eliminated), stimulate greater efficiency, and ensure long-term fiscal solvency.
Recent events at the University of Michigan suggest that faculty should be proactive in the face of such “corporatization” schemes, which typically are packaged and presented as necessary and consistent with a commitment to continued excellence. The wholesale application of such strategies can upend core academic values of transparency and shared governance, and strike at the heart of workplace equity.
Early this month our university administration rolled out the “Workforce Transition” phase of its “Administrative Services Transformation” (AST) plan. From on high, with virtually no faculty leadership input, 50 to 100 staff members in the College of Literature, Science, and the Arts (LS&A) departments were informed that their positions in HR and finance (out of an anticipated total of 325) would be eliminated by early 2014. Outside consultants, none of whom actually visited individual departments for any serious length of time, reduced these positions to what they imagined as their “basic” functions: transactional accounting and personnel paperwork.
It became clear that many of those impacted constitute a specific demographic: women, generally over 40 years of age, many of whom have served for multiple decades in low- to mid-level jobs without moving up the ranks. A university previously committed to gender equity placed the burden of job cuts on the backs of loyal and proven female employees.
These laid-off employees found little comfort in learning that they would be free to apply for one of 275 new positions in HR or finance housed at an off-campus “shared services” center disconnected from the intellectually vital campus life.
The resulting plan reveals no awareness of how departments function on an everyday basis. Such “shared services” models start with the presumption that every staff member is interchangeable and every department’s needs are the same. They frame departments as “customers” of centralized services, perpetuating the illusion that the university can and should function like a market. This premise devalues the local knowledge and organic interactions that make our units thrive. Indeed, it dismisses any attribute that cannot be quantitatively measured or “benchmarked.” Faculty members who reject these models quickly become characterized as “change resisters”: backward, tradition-bound, and incapable of comprehending budgetary complexities.
The absence of consultation with regard to the plan is particularly galling given that academic departments previously have worked well with the administration to keep the university in the black. Faculty members are keenly aware of our institution’s fiscal challenges and accordingly have put in place cost-cutting and consolidating measures at the micro level for the greater good.
Worries about departmental discontentment with AST and shared services resulted in increasing secrecy around the planned layoffs. In an unprecedented move, department chairs and administrators were sworn to silence by “gag orders” prohibiting them from discussing the shared services plan even with each other. Perturbed, close to 20 department chairs wrote a joint letter to top university executives expressing their dismay. As one department chair said, “The staff don't know if they can trust the faculty, the faculty don't know if they trust the administration.”
Within a few days, at least five LS&A departments had written collective letters of protest, signed by hundreds of faculty members and graduate students. Over the past few weeks, that chorus of opposition has only intensified as faculty members from all corners of our campus have challenged AST. Some have called for a one- to two-year moratorium and others for an outright suspension of the program.
The outcry against the planned transition itself reflects the growing rift between departmental units and the central administration at the University of Michigan. Championed as an astute financial fix by a cadre hidden away in the upper-level bureaucracy, the shared-services model is the brainchild of Accenture, an outside consulting firm which our university has also contracted for a multimillion-dollar IT rationalization project.
Caught off-guard by the strong pushback, the administration has issued several messages admitting that their communication strategies around these changes were inadequate, stating that for now layoffs will be avoided, and assuring us that there will be greater consultation and transparency going forward.
While these definitely are hopeful signs, important questions about institutional priorities and accountability have arisen.
Initially, the university’s consultants claimed that AST would yield savings of $17 million. Over time that figure shrank to $5 million, and by some accounts is now reputed to be as low as $2 million. Yet the university has already reportedly spent at least $3 million on this effort, with even more spending on the horizon.
Where are the cost savings? How much more will the university spend on Accenture and other outside consultants? How will replacing or shifting valued employees, even at lower numbers and salaries, from their departmental homes to what essentially is a glorified offsite “call center” actually enhance efficiency? How can a university ostensibly committed to gender equity justify making long-serving and superb female employees pay the price of AST? What credible proof is there that centralized management will provide any budgetary or administrative benefits to the specialized needs of individual departments?
The implications of these questions are thrown into starker relief when considering that almost to the day of the announced layoffs, the university launched its most ambitious capital campaign, “Victors for Michigan,” with festivities costing more than $750,000 and a goal of raising $4 billion.
Whether or not the collective protest initiated by a critical mass of faculty will result in change or reversal remains to be seen. Nevertheless, the past few weeks have been a wake-up call. Faculty must educate themselves about the basic fiscal operations of the institution in these changing times and reassert their leadership. Gardens, after all, require frequent tending.
Otherwise, we remain vulnerable to opportunistic management consultants seeking to use fiscal crisis as a source of profit. Public institutions that remain under the spell of misleading corporate promises will ultimately save little and lose a great deal.
Anthony Mora is associate professor of American culture and history at the University of Michigan. Alexandra Minna Stern is professor of American culture and history, and a professor of obstetrics and gynecology at the University of Michigan.
For some reason I have become aware that it is possible to take photographs of bass guitar players in mid-performance and, by digital means, to replace their instruments with dogs, so that it then appears the musicians (who very often wear facial expressions suggesting rapture or deep concentration) are tickling the dogs. Yes, yes it is.
I am not proud of this knowledge and did not seek it out, and would have forgotten about it almost immediately if not for something else occupying my attention in the past few days: a couple of new books treating the phenomenon with great and methodical seriousness. Not, of course, the dog-tickling bass player phenomenon as such, but rather, the kind of online artifact indicated by the titles of Karine Nahon and Jeff Hemsley’s Going Viral (Polity) and Limor Shifman’s Memes in Digital Culture (MIT Press).
The authors differentiate between the topics of the two volumes. Despite a common tendency to equate them, memes don’t always “go viral.” Things that do (say, video shot during a typhoon, uploaded while the disaster is still under way) are not always memes. The distinction will be clarified shortly -- and there is indeed some value in defining the contrast. It corresponds to different kinds of behavior or, if you prefer, different ways of mediating social and cultural life by means of our all-but-inescapable digital devices.
Still, the line can be drawn only so sharply. It seems bright and clear when the authors bring their different methods (one book more quantitative than qualitative, the other the reverse) to the job. I don’t mean that the difference between viral and memetic communication is simply one of perspective. It seems to exist in real life. But so does their tendency to blur.
“Virality,” write Nahon and Hemsley in a definition unlikely to be improved upon, “is a social information flow process where many people simultaneously forward a specific information item, over a short period of time, within their social networks, and where the message spreads beyond their own (social) networks to different, often distant networks, resulting in a sharp acceleration in the number of people who are exposed to the message.” (Nahon is an associate professor, and Hemsley a Ph.D. candidate, at the Information School of the University of Washington.)
Here the term “information item” is used very broadly, to cover just about any packet of bytes: texts, photographs, video, sound files, etc. It also includes links taking you to such material. But unlike a computer virus -- an unwanted, often destructive such packet -- a message that has “gone viral” doesn’t just forward itself. It propagates through numerous, dispersed, and repeated decisions to pay attention to something and then circulate it.
The process has a shape. Charting on a graph the number of times a message is forwarded over time, we find that the curve for a news item appearing at a site with a great deal of traffic (or a movie trailer advertised on a number of sites) shoots up at high speed, then falls just about as rapidly. The arc is rapid and smooth.
By contrast, the curve for an item going viral is a bit more drawn-out -- and a lot rougher. It may show little or no motion for a while before starting to trend upwards for a while (possibly followed by a plateau or downturn or two) until reaching a certain point at which the acceleration becomes extremely sharp, heading to a peak, whereupon the number of forwards begins to fall off, more or less rapidly -- with an occasional bounce upwards perhaps, but nothing so dramatic as before.
So the prominently featured news item or blockbuster ad campaign on YouTube shoots straight up, like a model rocket on a windless day, until the fuel (newsworthiness, dollars) runs out, whereupon it stops, then begins to accelerate in the opposite direction. But when something goes viral, more vectors are involved. It circulates within and between clusters of people -- individuals with strong mutual connections with each other. It circulates through the networks, formal or informal, in which those clusters are embedded.
And from there, onward and outward -- whether with a push (when somebody with a million Twitter followers takes notice), or a pull (it begins to rank among top search-engine results on a certain topic), or both. The authors itemize factors in play in decisions about whether or not to share something: salience, emotional response, congruence with the person’s values, etc. And their definition of virality as “a social information flow process” takes into account both the horizontal dimension of exchange (material circulating spontaneously among people familiar with one another) and the roles of filtering and broadcasting exercised by individuals and online venues with a lot of social capital.
None of which makes virality something that can be planned, however. “Content that we create can remain stubbornly obscure even when we apply our best efforts to promote it,” they write. “It can also grow and spread with an apparent life and momentum of its own, destroying some people’s lives and bringing fame and fortune to others, sometimes in a matter of days.”
An Internet meme, as Limor Shifman sums things up, is “(a) a group of digital items sharing common characteristics of content, form, and/or stance; (b) that were created with awareness of each other; and (c) were circulated, imitated, and/or transformed via the Internet by many users.”
As with virality, the concept rests on a biological metaphor. Coined by Richard Dawkins in 1976, “meme” began in a quasi-scientific effort to identify the gene-like elements of behavior, cultural patterns, and belief systems that caused them to persist, expand, and reproduce themselves over very long periods of time. As reincarnated within cyberculture, the meme is a thing of slighter consequence: a matter of endless variation on extremely tenacious inside jokes, occupying and replicating within the brains of bored people in offices.
Shifman's point that memetic communication (which for the most part involves mimicry of existing digital artifacts with parodic intent and/or "remixing" them with new content) is an exemplary case of Web 2.0 culture seems to me sound, which probably also explains why much in the book may seem familiar even to someone not up on LOLcats studies. Yes, memes are a form of active participation in digital communication. Yes, they can carry content that (whether the meme goes viral or not) questions or challenges existing power structures. I have seen my share of Downfall parody videos, and am glad to know that Bruno Ganz is okay with the whole thing. But every so often that line from Thoreau comes to mind -- "as if we could kill time without injuring eternity" -- and it seems like a good idea to go off the grid for a while.
Over the last year there has been a steady stream of articles about the “crisis in the humanities,” fostering a sense that students are stampeding from liberal education toward more vocationally oriented studies. In fact, the decline in humanities enrollments, as some have pointed out, is wildly overstated, and much of that decline occurred in the 1970s and 1980s. Still, the press is filled with tales about parents riding herd on their offspring lest they be attracted to literature or history rather than to courses that teach them to develop new apps for the next, smarter phone.
America has long been ambivalent about learning for its own sake, at times investing heavily in free inquiry and lifelong learning, and at other times worrying that we need more specialized training to be economically competitive. A century ago these worries were intense, and then, as now, pundits talked about a flight from the humanities toward the hard sciences.
Liberal education was a core American value in the first half of the 20th century, but a value under enormous pressure from demographic expansion and the development of more consistent public schooling. The increase in the population considering postsecondary education was dramatic. In 1910 only 9 percent of students received a high school diploma; by 1940 it was 50 percent. For the great majority of those who went on to college, that education would be primarily vocational, whether in agriculture, business, or the mechanical arts. But even vocationally oriented programs usually included a liberal curriculum -- a curriculum that would provide an educational base on which one could continue to learn -- rather than just skills for the next job. Still, there were some then (as now) who worried that the lower classes were getting “too much education.”
Within the academy, between the World Wars, the sciences assumed greater and greater importance. Discoveries in physics, chemistry, and biology did not seem to depend on the moral, political, or cultural education of the researchers – specialization seemed to trump broad humanistic learning. These discoveries had a powerful impact on industry, the military, and health care; they created jobs! Specialized scientific research at universities produced tangible results, and its methodologies – especially rigorous experimentation – could be exported to transform private industry and the public sphere. Science was seen to be racing into the future, and some questioned whether the traditional ideas of liberal learning were merely archaic vestiges of a mode of education that should be left behind.
In reaction to this ascendancy of the sciences, many literature departments reimagined themselves as realms of value and heightened subjectivity, as opposed to so-called value-free, objective work. These “new humanists” of the 1920s portrayed the study of literature as an antidote to the spiritual vacuum left by hyperspecialization. They saw the study of literature as leading to a greater appreciation of cultural significance and a personal search for meaning, and these notions quickly spilled over into other areas of humanistic study. Historians and philosophers emphasized the synthetic dimensions of their endeavors, pointing out how they were able to bring ideas and facts together to help students create meaning. And arts instruction was reimagined as part of the development of a student’s ability to explore great works that expressed the highest values of a civilization. Artists were brought to campuses to inspire students rather than to teach them the nuances of their craft. During this interwar period a liberal education surely included the sciences, but many educators insisted that it not be reduced to them. The critical development of values and meaning was a core function of education.
Thus, despite the pressures of social change and of the compelling results of specialized scientific research, there remained strong support for the notion that liberal education and learning for its own sake were essential for an educated citizenry. And rather than restrict a nonvocational education to established elites, many saw this broad teaching as a vehicle for ensuring commonality in a country of immigrants. Free inquiry would model basic democratic values, and young people would be socialized to American civil society by learning to think for themselves.
By the 1930s, an era in which ideological indoctrination and fanaticism were recognized as antithetical to American civil society, liberal education was acclaimed as key to the development of free citizens. Totalitarian regimes embraced technological development, but they could not tolerate the free discussion that led to a critical appraisal of civic values. Here is the president of Harvard, James Bryant Conant, speaking to undergraduates just two years after Hitler had come to power in Germany:
To my mind, one of the most important aspects of a college education is that it provides a vigorous stimulus to independent thinking.... The desire to know more about the different sides of a question, a craving to understand something of the opinions of other peoples and other times mark the educated man. Education should not put the mind in a straitjacket of conventional formulas but should provide it with the nourishment on which it may unceasingly expand and grow. Think for yourselves! Absorb knowledge wherever possible and listen to the opinions of those more experienced than yourself, but don’t let any one do your thinking for you.
This was the 1930s version of liberal learning, and in it you can hear echoes of Thomas Jefferson’s idea of autonomy and Ralph Waldo Emerson’s thoughts on self-reliance.
In the interwar period the emphasis on science did not, in fact, lead to a rejection of broad humanistic education. Science was a facet of this education. Today, we must not let our embrace of STEM fields undermine our well-founded faith in the capacity of the humanities to help us resist “the straitjackets of conventional formulas.” Our independence, our freedom, has depended on not letting anyone else do our thinking for us. And that has demanded learning for its own sake; it has demanded a liberal education. It still does.
Michael Roth is president of Wesleyan University. His new book, Beyond the University: Why Liberal Education Matters, will be published next year by Yale University Press. His Twitter handle is @mroth78.
The liberal arts are dead, or — at best — dying. That's the theme of story after story in today’s news media.
Professional skills training is in. The STEM (science, technology, engineering, and math) fields are in. Practical, vocational higher education is in. The liberal arts are out, relics of a “traditional” way of thinking that has been overtaken by the pressing demands of our dizzyingly complex digital age.
As new students arrived on college campuses this fall, the message many of them heard is that majoring in history, or English, or anthropology is a surefire recipe for a life of irrelevance and poor job prospects. These “conventional” disciplines cannot possibly train students for productive, enriching careers in the high-tech information age whose future is now.
Although this viewpoint is rapidly gaining the status of settled wisdom, it is tragically misguided. It is based on a false dichotomy, namely that the liberal arts and the more vocational, preprofessional, practical disciplines — like, say, computer science — are fundamentally different and opposed. But this misunderstands both the age we’re living in and the challenges we face, not to mention one of the most significant trends in higher education over the last few decades — the evolution of interdisciplinarity.
In essence, this whole debate comes down to skills. The liberal arts are often said by critics to provide little that is of “practical value” in the “real world.” In reality, though, liberal arts curriculums can and do give students skills that are just as professionally useful as those in more “relevant” occupationally specific fields of study.
At my university, the University of Maryland-Baltimore County, students this fall can declare a new major called global studies, which integrates courses in 12 liberal arts departments — including economics, geography and environmental systems, history, media and communication studies, and political science — into a rigorous interdisciplinary curriculum. Majors are required to study abroad and to achieve fluency in at least one foreign language. By graduation, they will have demonstrated their research, analytical, critical-thinking, and writing skills in a substantial, “capstone” research project. Our students will also do internships with companies, not-for-profits, and government agencies.
Equally important, they will develop “global competence,” which employers in many professions have identified as one of the most desirable, but grossly lacking, sets of skills required of their new employees. Broadly defined, global competence is “the capacity and disposition to understand and act on issues of global significance.” Its central elements include knowledge of world affairs — cultural, economic, and political; proficiency in communicating with people in and from other societies, both verbally and in writing; the ability to appreciate multiple perspectives and respect cultural diversity; and the intellectual and psychological flexibility to adapt to unfamiliar and rapidly changing circumstances.
Developing the skills that we hope to instill in UMBC’s global studies majors is an inherently interdisciplinary mission. In a recent New York Times column, Yale professor Nicholas Christakis argues that the social sciences (a subset of the liberal arts) badly trail the natural sciences in generating innovative “institutional structures” that can produce the kind of cutting-edge science necessary for solving some of the world’s most intractable — often intrinsically interdisciplinary — problems. However, he also notes that this is beginning to change, for example, in the form of a new global affairs major at Yale.
Whether it’s global studies at UMBC or global affairs at Yale, these exciting new programs tangibly articulate why talking about liberal arts education versus practical training creates the false perception that these two enterprises are essentially at odds. At UMBC, it's the combination of interdisciplinary liberal arts education; substantial research, writing and analysis; rigorous foreign language training; study abroad; and experiential learning in the form of internships and other applied opportunities that will give students the skills they will need to thrive and “do good” in the 21st century.
The tragedy is that we might blow it. If we continue to present students with a false choice between the liberal arts and “real-world” vocational training, we will produce what social scientists like to call “suboptimal” outcomes. Too many talented, energetic, hard-working students will choose “safe” educational and career paths, and too many truly global problems will go unsolved.
Devin T. Hagerty is a professor of political science and director of global studies at the University of Maryland-Baltimore County.