In the 1966-67 TV series “It’s About Time,” two astronauts travel back in time and visit with some cave people (including the 20th-century character actress Imogene Coca), and then some of the cave people travel forward in time to the 20th century. Both the astronauts and the cave people learn things in usual, and unusual, places and times. Similarly, there is increasing recognition in higher education that students today can learn anywhere, any time. So whether or not you are old enough to remember this sitcom, you may be aware that learning in today’s institutions of higher education is becoming more independent of the dimensions of time and space.
Where, when, and how learning takes place in higher education is going through significant changes. These changes greatly benefit students, but they also have consequences for our institutions of higher education, consequences that reach far beyond the classroom itself — into not just technology, but governance, labor practice, and policy.
First, higher education’s increasing independence from time and space is coupled with an increasing emphasis on learning outcomes: it is what you learn, not where or when you learn it, that matters. You could learn while you are a matriculated student in a Tuesday/Thursday 10-11:20 a.m. course at your local community college, or you could learn on occasional Saturday afternoons while you are sitting on the beach. The important thing is that you learn. Further, if what is important is the actual learning outcomes, it is essential to be able to tell whether those outcomes have been achieved.
A focus on outcomes is not new in higher education. Nevertheless, American higher education has historically focused largely on the mechanisms that produce outcomes, and not so much on the outcomes themselves. Thus state, regional, and professional accreditors require colleges and universities to demonstrate that they have qualified faculty, appropriate syllabi, and the proper numbers of class hours — all examples of assumed inputs to good outcomes — rather than requiring the institutions to demonstrate that their students know what they should in order to receive their degrees. The most commonly mentioned outcome is graduation, as part of what is known as the college completion agenda.
But, unfortunately, graduation does not necessarily mean that a student has learned anything. For graduation to be meaningful, it must represent measurable, verifiable achievement of specific learning outcomes, a goal toward which many organizations and institutions are working. For example, according to the Lumina Foundation’s work on the Degree Qualifications Profile, a baccalaureate student in a certain field “defines and explains the boundaries and major sub-fields, styles, and/or practices of that field.”
A relatively recent example of this more outcomes-based focus is the LEAP initiative of the Association of American Colleges and Universities. LEAP (Liberal Education and America’s Promise) specifies a set of “essential learning outcomes” that every 21st-century college graduate should achieve. For example, “…continuing at successively higher levels across their college studies, students should prepare for twenty-first century challenges by gaining… intellectual and practical skills, including... critical and creative thinking... practiced extensively, across the curriculum, in the context of progressively more challenging problems, projects, and standards for performance.”
For many years the City University of New York has worked diligently to turn its focus to learning outputs rather than inputs. One example is CUNY’s Performance Management Process, which began in 1999, and which encourages CUNY colleges to set outcomes goals such as “increase licensing examination pass rates” as opposed to “increase the number of classroom instruction hours for students preparing to take the licensing examination.” The latter may be a perfectly valid activity, but it is not a learning outcome.
A relatively recent example of CUNY’s focus on learning outcomes concerns CUNY’s Pathways initiative, approved by the CUNY Board of Trustees and then-Chancellor Matthew Goldstein in 2011. Designed to smooth transfer for CUNY students, Pathways includes a framework for general education that applies to all 19 undergraduate colleges of CUNY. This framework is defined, not in terms of particular courses that students must take (which would be inputs), but in terms of learning outcomes.
Now that the Pathways project has been essentially completed, with its courses first offered in fall 2013, CUNY students are supposed to achieve certain learning outcomes by virtue of taking the Pathways courses. However, each individual CUNY college determines which specific courses will be offered to achieve these learning outcomes. Thus a student at one college might take a course entitled “Contemporary Asia” to satisfy the world cultures learning outcomes of Pathways, and a student at another college might take a course entitled “World History to 1500.” In both cases, the expectation is that on completion of the course, the student can, for example, “analyze culture, globalization, or global cultural diversity, and describe an event or process from more than one point of view.” Yet, though CUNY has decided to focus on learning outcomes for its general education courses, how to measure those outcomes is still under discussion. Assessing learning outcomes can be much more difficult than checking off inputs, such as the amount of time students spend sitting in class (see this link for an example of how the AAC&U suggests that the learning outcome of critical thinking can be assessed).
CUNY’s focus on learning outcomes for the Pathways general education curriculum directly promotes space-independent learning in that students can take Pathways courses anywhere at CUNY and then receive credit anywhere at CUNY. However, although Pathways enables students to complete their degrees more efficiently, some CUNY faculty have stated that Pathways interferes with the faculty having complete control over the curriculum and decreases educational quality. The CUNY administration has countered that New York State Education Law gives control over the curriculum to the CUNY Board of Trustees, and that the actual curricular content of Pathways has in any case been created and approved by the faculty. The conflict has reached the courts.
More generally, assuming that we can measure outcomes, the focus on outcomes rather than on inputs brings us back to the independence of learning from time and place that is increasingly characterizing higher education. For it is this very focus on outcomes that validates, for example, assessment of prior learning (including learning done outside of a college or university) as a method for determining whether someone has sufficient learning to be certified as having completed a particular course or degree. The American Council on Education standards regarding how to evaluate learning achieved as a result of military service are a good example of how to standardize the assessment of learning outcomes. For example, ACE recommends that people who have served as Coast Guard copilots receive a total of 40 college semester-hour credits in topics such as aviation meteorology, flight physiology, and crew resource management.
Some universities have gone further, explicitly and actively seeking ways to help students proceed very efficiently in accumulating certified achievements. These universities may use traditional classroom-based study only when necessary (see Western Governors University’s competency-based education, Southern New Hampshire University’s competency-based general studies degree, and Northern Arizona University’s new competency-based education). Such approaches constitute a key component of President Obama’s recent college cost control plan.
Also in the recent news is higher education’s use of MOOCs (massive open online courses). Because these courses are free and open to everyone, students can take them at their own convenience, and then, by means of some official assessment, students can be certified as having learned a particular topic and, ultimately, be awarded a degree. However, because of MOOCs’ low completion rates, there are currently many attempts to modify MOOCs (so that they are no longer massive or open) to try to improve MOOC students’ completion rates (and thus their achievement of the MOOCs’ learning outcomes). For example, Georgia Institute of Technology is structuring a master’s program that exposes students to MOOCs that are open and online for all (with the MOOCs serving the traditional roles of lectures and textbooks), but that also provides Georgia Tech students with individualized tutoring and proctored exams, so that those students do not experience traditional MOOCs.
Another example of higher education’s increasing focus on outcomes as opposed to space and time involves the unbundling of course learning. In such cases students are first assessed to see what they already know, and then they are instructed only on what they do not know. This is not a new approach, but the current focus on learning outcomes highlights this technique’s usefulness.
The unbundling of learning such that students need to spend time learning only what they do not already know, assessment of prior learning, and online (particularly asynchronous) learning are all examples of the same goal: freeing learning from time and space constraints and focusing on learning outcomes.
All of these trends should help students finish their degrees, and finish them faster. However, this degree acceleration will occur only if colleges and universities provide the conditions needed to facilitate this kind of work — enabling students to achieve the desired learning outcomes by whatever path is easiest for them. To do this, the college and university infrastructure that supports learning should not be tied to particular times or places. Students should have opportunities to access what they need at any time and in any place — with or without an instructor, and with or without an adviser in physical proximity. Learning tools should be available that are easy to use in different places (i.e., portable and accessible regardless of someone’s abilities), and those learning tools should function in the same way no matter what time it is (i.e., they should be asynchronous).
There should also be easily available — at any time and in any place — other sorts of supports, such as reference material, technology support, tutoring, counseling, colleagues with whom to discuss the material, cocurricular activities, etc. Finally, of course, there must be good assessments of the outcomes, and everything must be affordable, not only for the students and their families, but also for the institution and, in the case of public institutions, for taxpayers. This is not an easy list to accomplish, by any means, but ongoing technological developments are making it easier for institutions of higher education to provide all of these types of physical and virtual infrastructure supports.
Providing all of these student supports is not the only change occurring in colleges and universities as a result of the increasing emphasis on learning outcomes and the concomitant decreasing emphasis on time- and space-dependent learning. How an instructor best structures the learning experiences is also changing. Lessons should be different for students engaging in learning activities for 30 minutes whenever their children are napping or have gone to sleep for the night than for students attending a traditional lecture course every Monday, Wednesday, and Friday at 9 a.m. Lessons may need to be broken into smaller units with the material presented in a variety of ways, accompanied by optional multiple examples, and with continuous opportunities for learning assessment and feedback. The Khan Academy incorporates just these sorts of elements, which are possible contributors to its apparent significant success.
In other words, if the goal is truly to focus on learning outcomes, and to facilitate reaching those outcomes by whatever mechanisms work best for each individual, then it is necessary to enable all aspects of the learning process to operate independently of time and place. The lessons, assignments, and infrastructure supports for at least some students must be different than they were in the past.
Increasingly, federal, regional, and state regulations are permitting institutions to provide and certify learning that is place- and time-independent. For many years, much of the higher education system has been built around the concept of the credit hour (a similar concept is the Carnegie unit): students in class for a certain amount of time receive a credit. Credits earned by students count toward their degrees and are also the basis for how much tuition is charged, how much financial aid can be obtained, and how much workload credit a faculty member receives. As learning becomes independent of time and place, these uses of the credit hour become difficult, impossible, or irrelevant. Recognizing these consequences, in March 2013 the federal Department of Education issued a letter stating that it will consider giving financial aid based on how much a student has learned, rather than based on time in class. The Carnegie Foundation for the Advancement of Teaching, the originator of the Carnegie unit in 1906, is also exploring ways for students to be certified for having learned certain material without the intervening use of Carnegie units.
At the same time, hardware manufacturers and software developers are working intensively to make their devices and applications useful in supporting time- and space-independent learning. All of these developments will enable students to progress in ways that recognize the needs and accomplishments of each individual.
Nevertheless, colleges and universities themselves need to do more than simply measure outcomes and provide infrastructure support. Time- and space-independent learning requires changes in the very foundations of these institutions. If not by number of traditional credit hours taught, how will faculty workload be defined? Will faculty agree that prior learning that is achieved by means other than direct instruction can be just as valid as direct exposure (and often extended direct exposure) to themselves? Put another way, will faculty endorse the view that what is important for students is that they learn widely accepted correct information, obtained by whatever means best enables student progress, rather than information that each faculty member him- or herself specifically provides? Or will most faculty react to the changes in the learning process as did some CUNY faculty regarding the Pathways initiative? Or as did the philosophy faculty at San José State University when their campus administration suggested that their department make use of a MOOC produced outside of San José State University: “There is no pedagogical problem in our department that [this MOOC] solves, nor do we have a shortage of faculty capable of teaching the relevant course.” Colleges and universities may have to change many of their labor and governance policies in order to best facilitate the sort of learning described in this piece.
Thus the focus on outcomes, as well as the independence of learning from time and space constraints, is indeed about moving us along on the college completion agenda. However, this work has many additional implications for higher education and its associated industries. Just as in Heinlein’s Stranger in a Strange Land, higher education may involve lots of seemingly (at least at first) strange people doing strange things in strange places — a complex and difficult journey, but one with highly desirable outcomes.
Alexandra W. Logue is executive vice chancellor and provost of the City University of New York.
How many weeks does it take to get over a bad semester? Is it like the end of a bad relationship? Too painful to talk about except with one’s closest friend… or with a complete stranger?
You are my stranger.
Kari (I should use her real name but I won’t) had trouble with writing grammatically; she had trouble with phonetics, even. When she typed, she couldn’t see the difference between building and blinding.
When she read her work aloud, she didn’t seem to understand what she herself was saying. As far as she let on in our office conferences (which I called her in for), she was not in need of our college’s excellent Access-Ability program, though I think she was, and I tried to suggest it could be useful for her to find out.
She had never had problems with her writing before, she said. She had always earned Bs in English.
Why did she want to be a journalist, I asked. Yes, she was a journalism student in the last journalism course I will ever teach.
I’m hemming and hawing, because I don’t want to get to the story.
The story is that I was going to fail Kari, even though she tried. I was going to fail Kari because she was a journalism major and she did not have a grasp of writing in English. She had lived in America for 10 years and she was 21. In my experience with first-generation students, those who arrive before age 16 adapt to English very quickly. Kari didn’t have much of an accent; she had grown up in… let’s call it Asia. She lived in Queens -- with two other large families in one house. Her family had the basement apartment, and in the summer all three families liked to hang out in it because it was cooler there. She wrote a “personal” piece about that living arrangement. It was not as clear as I’ve summarized it. I thought that the students should hear and read aloud their own articles; some were terrific, some were bad. When Kari read her piece aloud, she continually stumbled and had long squinting pauses wherein she seemed to be trying to decipher hieroglyphics.
She was, on the other hand, as she claimed, a good listener -- to her fellow students and to guests. She asked visiting writers O.K. questions. The children with learning disabilities that I used to work with had clear and vibrant strengths that sometimes masked their dysfunctions. But in a journalism class, Kari’s dysfunctions were loud and clear almost every day.
I was going to fail her. To my shame or pride (your choice) I had never flunked a student who tried and did all the work. Kari tried and she did most of the work, much more than many of her could-be competent classmates. I appreciated that she attended regularly and was polite and pleasant. I liked her; but she was illiterate. She was not illiterate in that mean way we say when we talk about our distracted students; she was functionally illiterate in the way that someone can be legally blind; she had various perception gaps and fogginesses.
So I was going to flunk Kari, but then, toward the end of the semester, I went to one of my old standby assignments, the observation of a public space. I write down simple emphatic instructions and hand them out; I read the instructions aloud and ask for questions. They say, “I get it, I get it!” And usually they do.
Each student plants herself in a spot that is public to any member of the college community. She only has the class hour to go find the spot and sit herself there and start writing down everything and anything she hears, sees, smells. Objective. She isn’t to stop writing. Usually students like this; they realize they are taking in so many details they usually overlook. They catch actual language. Their hands get tired. I love the assignment. We did that and the next day reviewed our work. Kari’s illiteracy did not get in the way of her ability to accumulate details. It was a fine observation really. It was ungrammatical, but I understood it. Her language seemed at the level of some of my weaker developmental reading and writing students.
A week later, I had them repeat the assignment.
The next meeting I asked them to write about and then talk about the differences between their two observations. One of the students mentioned feeling self-conscious, because someone came up to her and asked her what she was doing. She took down the whole conversation, including, “Hey! You’re writing down what I’m saying!”
Another student, whose dial was always set on “Complaint,” said she felt creepy watching people. I pointed out again that there are cameras everywhere we go; we are continually photographed, filmed and electronically identified by nameless organizations; whereas in this modest assignment, we are only individuals looking at other people.
“That’s stalking, pofessa!”
“When did people-watching become stalking?”
Kari raised her hand. “Professor, I got looked at too.”
“I was sitting there at the Starbucks and I was writing… and this psychology professor -- I heard someone call to her and say, ‘You’re my psychology professor,’ and that’s how I knew that detail, professor -- ”
“And she kept walking around me and trying to look at my paper. I didn’t like that, but I didn’t do nothing and I just kept writing like you said and she kept coming over and then she went to the doors near the fishes, the fish tanks, and made a phone call on her cell phone -- I wrote that down -- and like a few minutes later a police car came up to the outside doors and the security guys got out and she went to them, and I saw her point at me.”
“And they came by and said, ‘Hey, what are you doing?’ and they made me give them my purse and my notes.”
“Oh, my god!”
“See, Professor!” said Complainer. “You got one of us arrested!”
“Then what happened?”
Kari told us they told her to follow them to the security office and then she got interrogated there about why she was writing about the arrangement of chairs and how many people were in line at the coffee stand and who told her to do this? …
“You told them who! You showed them my instructions, right?”
“I wanted to see what would happen.”
I realized at that moment: So, I’m not going to flunk Kari.
I was astounded by her description of the actions of the campus police and I was delighted with Kari. She was after all a journalist! She couldn’t write, of course, but she was a journalist at heart. I would pass her.
“So what happened?”
She explained to them she was just writing, that it wasn’t their or anybody else’s business.
“You’re making people really paranoid,” the interrogating officer told her.
And they let her go.
“That’s incredible!” I said. I regretted the assignment now, even though it had brought out Kari’s latent journalistic skills; arrested! That was unbelievable! It was a violation of civil rights! Of freedom of speech!
The students were taunting me, “See, see! Your assignments be getting us in trouble!”
I tried to justify it again. “But nothing really happened, finally -- except I’m going to go see the security people.”
“No,” said Kari. “That’s O.K. I still want to see what happens. You can just give me a note that I’m a student in your class so I can get my purse and my ID card back.”
It was just a few weeks after the Boston Marathon tragedy and the officers at our entry-gates were being careful about identification checks again. She needed her card.
I wrote her the note and said, “You sure you don’t want me to go with you to Security? I want to go. I’m really upset about this.”
“No, no. I can handle it. It’s my story, right?” She flashed her big eyes at me and nodded, begging me.
But I was uneasy.
Later that afternoon I went to a meeting and mentioned to a colleague what happened to Kari and she made me repeat the details about the psychology professor ratting out the student; she said the arrest was outrageous and I agreed. “What are you going to do, Bob?”
“I don’t know yet.” Why did I hesitate? Why didn’t I march over to the security office?
After the meeting, I went to my department mailbox and Kari had left her notes from the observation as well as the last draft of her last article.
What was I going to do?
On the subway home I read her observation notes and got even more outraged at the security officers -- and even more unsettled with myself for not having already confronted them about it. I started reading her article. It was not her article. Every sentence was grammatical. It seemed to be not one but two professional articles stitched together (which I discovered later it was).
All right, she’d get an F.
But meanwhile, her rights had been violated. I couldn’t let that go.
The next day at school, when I asked Kari to see me at the end of class, she came up and I told her I was about to go to the security office. She asked me please not to; it was her story.
“Yes, that one is,” I agreed. I opened my folder and pulled out her three pages of plagiarism. “But this is not your article.”
“Yes, it is. I gave it to you.”
“But you didn’t write it.”
“I made it.”
“You made it?”
“I researched it. You said to use research.”
“This is not research. This is two articles from the Internet you’ve put together.”
“I put it together. I wrote it.”
“You didn’t write it.”
“The tutor helped me.”
“This is plagiarism, Kari.”
“You didn’t write these words and yet at the top of the page you write, ‘By Kari M --.’”
“Yeah,” she sighed, “I see your point.” She nodded. Meeting my eyes, she said, “O.K., so I’m going to fail now?”
“Yes. But I still want to get to the bottom of your run-in with security.”
“It doesn’t matter anymore.”
“It matters to me.”
But I didn’t go to the security office. I ran into a senior colleague, a former journalist, and told him about Kari’s arrest. I knew by his puzzlement that he thought I should have already gone to security. This was an important matter.
And yet… that plagiarism.
Instead of walking to the security office after my last class of the day, I walked to the subway and sifted the situation through my head: “I’m going to go to the defense of a plagiarizing student …” (she had plagiarized her first article too; I might’ve thought of that earlier, but when it happened, she had convinced me she had only been confused about using sources) “…a double-plagiarizing student who is illiterate and whom I’m going to fail.”
The next day, a non-teaching day that I spent at home, my conscience gnawing at me, my cowardice sitting up straight at my computer, I wrote an angry email to the security director. I should say -- I have to say -- at the last second I cc’ed the dean of the college on it. (I wanted action, Jackson!) The security director responded immediately by email, thanking me for bringing the matter to his attention and saying he would investigate and get back to me as soon as possible.
Two days passed. I let the weekend go by and on Monday morning when I showed up in the department office, my chair greeted me, shaking her head. “Your student? -- Unbelievable, huh?”
“That she made the whole thing up!”
“You didn’t hear? From the security director? He was pretty upset at you too.”
“She just wanted not to fail, so she made it up.”
I was blinking in disbelief.
“She confessed, Bob!”
Let me confess, at first I thought that explanation was too simple, that Kari must’ve been bullied into saying she had lied... and lied... and lied. Oh, yeah.
I winced, retreated to my office with my tail between my legs and emailed an apology to the security director and the dean.
Bob Blaisdell is a professor of English at City University of New York’s Kingsborough Community College.
Over the last year there has been a steady stream of articles about the “crisis in the humanities,” fostering a sense that students are stampeding from liberal education toward more vocationally oriented studies. In fact, the decline in humanities enrollments, as some have pointed out, is wildly overstated, and much of that decline occurred in the 1970s and 1980s. Still, the press is filled with tales about parents riding herd on their offspring lest they be attracted to literature or history rather than to courses that teach them to develop new apps for the next, smarter phone.
America has long been ambivalent about learning for its own sake, at times investing heavily in free inquiry and lifelong learning, and at other times worrying that we need more specialized training to be economically competitive. A century ago these worries were intense, and then, as now, pundits talked about a flight from the humanities toward the hard sciences.
Liberal education was a core American value in the first half of the 20th century, but a value under enormous pressure from demographic expansion and the development of more consistent public schooling. The increase in the population considering postsecondary education was dramatic. In 1910 only 9 percent of students received a high school diploma; by 1940 it was 50 percent. For the great majority of those who went on to college, that education would be primarily vocational, whether in agriculture, business, or the mechanical arts. But even vocationally oriented programs usually included a liberal curriculum -- a curriculum that would provide an educational base on which one could continue to learn -- rather than just skills for the next job. Still, there were some then (as now) who worried that the lower classes were getting “too much education.”
Within the academy, between the World Wars, the sciences assumed greater and greater importance. Discoveries in physics, chemistry, and biology did not seem to depend on the moral, political, or cultural education of the researchers – specialization seemed to trump broad humanistic learning. These discoveries had a powerful impact on industry, the military, and health care; they created jobs! Specialized scientific research at universities produced tangible results, and its methodologies – especially rigorous experimentation – could be exported to transform private industry and the public sphere. Science was seen to be racing into the future, and some questioned whether the traditional ideas of liberal learning were merely archaic vestiges of a mode of education that should be left behind.
In reaction to this ascendancy of the sciences, many literature departments reimagined themselves as realms of value and heightened subjectivity, as opposed to so-called value-free, objective work. These “new humanists” of the 1920s portrayed the study of literature as an antidote to the spiritual vacuum left by hyperspecialization. They saw the study of literature as leading to a greater appreciation of cultural significance and a personal search for meaning, and these notions quickly spilled over into other areas of humanistic study. Historians and philosophers emphasized the synthetic dimensions of their endeavors, pointing out how they were able to bring ideas and facts together to help students create meaning. And arts instruction was reimagined as part of the development of a student’s ability to explore great works that expressed the highest values of a civilization. Artists were brought to campuses to inspire students rather than to teach them the nuances of their craft. During this interwar period a liberal education surely included the sciences, but many educators insisted that it not be reduced to them. The critical development of values and meaning was a core function of education.
Thus, despite the pressures of social change and of the compelling results of specialized scientific research, there remained strong support for the notion that liberal education and learning for its own sake were essential for an educated citizenry. And rather than restrict a nonvocational education to established elites, many saw this broad teaching as a vehicle for ensuring commonality in a country of immigrants. Free inquiry would model basic democratic values, and young people would be socialized to American civil society by learning to think for themselves.
By the 1930s, an era in which ideological indoctrination and fanaticism were recognized as antithetical to American civil society, liberal education was acclaimed as key to the development of free citizens. Totalitarian regimes embraced technological development, but they could not tolerate the free discussion that led to a critical appraisal of civic values. Here is the president of Harvard, James Bryant Conant, speaking to undergraduates just two years after Hitler had come to power in Germany:
To my mind, one of the most important aspects of a college education is that it provides a vigorous stimulus to independent thinking.... The desire to know more about the different sides of a question, a craving to understand something of the opinions of other peoples and other times mark the educated man. Education should not put the mind in a straitjacket of conventional formulas but should provide it with the nourishment on which it may unceasingly expand and grow. Think for yourselves! Absorb knowledge wherever possible and listen to the opinions of those more experienced than yourself, but don’t let any one do your thinking for you.
This was the 1930s version of liberal learning, and in it you can hear echoes of Thomas Jefferson’s idea of autonomy and Ralph Waldo Emerson’s thoughts on self-reliance.
In the interwar period the emphasis on science did not, in fact, lead to a rejection of broad humanistic education. Science was a facet of this education. Today, we must not let our embrace of STEM fields undermine our well-founded faith in the capacity of the humanities to help us resist “the straitjackets of conventional formulas.” Our independence, our freedom, has depended on not letting anyone else do our thinking for us. And that has demanded learning for its own sake; it has demanded a liberal education. It still does.
Michael Roth is president of Wesleyan University. His new book, Beyond the University: Why Liberal Education Matters, will be published next year by Yale University Press. His Twitter handle is @mroth78.
The liberal arts are dead, or — at best — dying. That's the theme of story after story in today’s news media.
Professional skills training is in. The STEM (science, technology, engineering, and math) fields are in. Practical, vocational higher education is in. The liberal arts are out, relics of a “traditional” way of thinking that has been overtaken by the pressing demands of our dizzyingly complex digital age.
As new students arrived on college campuses this fall, the message many of them heard is that majoring in history, or English, or anthropology is a surefire recipe for a life of irrelevance and poor job prospects. These “conventional” disciplines cannot possibly train students for productive, enriching careers in the high-tech information age whose future is now.
Although this viewpoint is rapidly gaining the status of settled wisdom, it is tragically misguided. It is based on a false dichotomy, namely that the liberal arts and the more vocational, preprofessional, practical disciplines — like, say, computer science — are fundamentally different and opposed. But this misunderstands both the age we’re living in and the challenges we face, not to mention one of the most significant trends in higher education over the last few decades — the evolution of interdisciplinarity.
In essence, this whole debate comes down to skills. The liberal arts are often said by critics to provide little that is of “practical value” in the “real world.” In reality, though, liberal arts curriculums can and do give students skills that are just as professionally useful as those in more “relevant” occupationally specific fields of study.
At my university, the University of Maryland-Baltimore County, students this fall can declare a new major called global studies, which integrates courses in 12 liberal arts departments — including economics, geography and environmental systems, history, media and communication studies, and political science — into a rigorous interdisciplinary curriculum. Majors are required to study abroad and to achieve fluency in at least one foreign language. By graduation, they will have demonstrated their research, analytical, critical-thinking, and writing skills in a substantial, “capstone” research project. Our students will also do internships with companies, not-for-profits, and government agencies.
Equally important, they will develop “global competence,” which employers in many professions have identified as one of the most desirable, but grossly lacking, sets of skills required of their new employees. Broadly defined, global competence is “the capacity and disposition to understand and act on issues of global significance.” Its central elements include knowledge of world affairs — cultural, economic, and political; proficiency in communicating with people in and from other societies, both verbally and in writing; the ability to appreciate multiple perspectives and respect cultural diversity; and the intellectual and psychological flexibility to adapt to unfamiliar and rapidly changing circumstances.
Developing the skills that we hope to instill in UMBC’s global studies majors is an inherently interdisciplinary mission. In a recent New York Times column, Yale professor Nicholas Christakis argues that the social sciences (a subset of the liberal arts) badly trail the natural sciences in generating innovative “institutional structures” that can produce the kind of cutting-edge science necessary for solving some of the world’s most intractable — often intrinsically interdisciplinary — problems. However, he also notes that this is beginning to change, for example, in the form of a new global affairs major at Yale.
Whether it’s global studies at UMBC or global affairs at Yale, these exciting new programs tangibly articulate why talking about liberal arts education versus practical training creates the false perception that these two enterprises are essentially at odds. At UMBC, it's the combination of interdisciplinary liberal arts education; substantial research, writing and analysis; rigorous foreign language training; study abroad; and experiential learning in the form of internships and other applied opportunities that will give students the skills they will need to thrive and “do good” in the 21st century.
The tragedy is that we might blow it. If we continue to present students with a false choice between the liberal arts and “real-world” vocational training, we will produce what social scientists like to call “suboptimal” outcomes. Too many talented, energetic, hard-working students will choose “safe” educational and career paths, and too many truly global problems will go unsolved.
Devin T. Hagerty is a professor of political science and director of global studies at the University of Maryland-Baltimore County.
A new analysis by ACT has found that only 36 percent of those who take the ACT and indicate a planned choice of college major are selecting a subject that is a good fit with their stated academic interests. Generally, those students scoring higher on the ACT were found to be more likely to have major choices that were a good fit.
Submitted by Paul Fain on November 8, 2013 - 3:00am
A new report from the Center for American Progress profiles 13 students who are enrolled in a range of competency-based degree programs at seven different institutions. Lawmakers are showing interest in competency-based education of late. In an effort to help shape policies that might emerge, the report tries to uncover commonalities between the students' experiences. It also features several policy recommendations, such as a call for standards of quality and experimentation with financial aid rules.
Two reports were issued Monday on medical education:
The Blue Ribbon Commission for the Advancement of Osteopathic Medical Education issued a report calling for a shift in osteopathic medical education and residencies away from assumptions based on years of study, to an approach based on measuring "readiness" for residency (in medical school) and "readiness for practice" during residency.
New York State received 20 percent of all of Medicare's graduate medical education (GME) funding while 29 states, including some with shortages of physicians, got less than 1 percent, according to a report published by researchers at the George Washington University School of Public Health and Health Services.
While surveys show that most of those who would be first-generation college students want to attend college, a majority are not prepared to succeed in key courses, according to a report released Monday by ACT and the Council for Opportunity in Education (COE). The study found that 52 percent of first-generation 2013 high school graduates who took the ACT met none of the four ACT College Readiness Benchmarks. That compares to 31 percent of all ACT-tested graduates who met none of the benchmarks. Only 9 percent of the first-generation students met all four benchmarks, while 26 percent of graduates overall did so. The benchmarks specify the minimum scores students must earn on each of ACT’s four subject tests (English, math, reading, and science) to have a 75 percent chance of earning a grade of C or higher in a typical credit-bearing first-year college course in the corresponding subject area.