Up to half of new graduates, by some estimates, are finding themselves jobless or underemployed. Why? As Andrew Sum, director of the Center for Labor Market Studies at Northeastern University, put it, "Simply put, we’re failing kids coming out of college." Recent pieces in The Atlantic and The Weekly Standard (claiming that proponents of the liberal arts have "lost the war" and that the liberal arts have been "killed") and elsewhere place much of the blame on liberal arts programs.
Let it be known: I was a student of the liberal arts (geography, Asian studies) at a liberal arts college (Clark University), and I founded and run a technology company in Silicon Valley. I wouldn’t have it any other way. I want our so-called "soft" studies (humanities, social sciences) to show some spine and mount a response. The typical defense of the status quo involves spinning the value of a liberal arts education, pitching the curriculum as promoting the ability to problem-solve, learn how to learn, and thrive in a knowledge economy. If the curriculum really teaches students to adapt to a knowledge economy, why can’t the professors who teach those skills employ them with some grace and poise? How can the liberal arts itself adapt to a changing world?
Simply put, we need to rethink what our students do to demonstrate their understanding. I’m not suggesting that we stop teaching literature and history and economics and psychology – or that students stop majoring in these fields. But we need to ask students to create, to experiment, to be bold and possibly fail with projects and deliverables relevant in today’s world. We’re too limited by Blue Book short essays and term papers -- in which success is easily measured and bell-curved. If we shift the way we ask students to demonstrate their knowledge within liberal arts fields, we can prepare students for employment by advancing the liberal arts.
We can achieve this revitalization by asking students to acquire and demonstrate 21st-century skills as the activities and assessments within the liberal arts curriculum. No longer can we assign formats that are isolated exercises; they need to be projects that communicate with and potentially affect the wider world. While peer-reviewed journal articles and regression analysis may be the way that professors communicate, the rest of the world has updated its formats. Academe, and in particular liberal arts programs, may be on the verge of being left behind.
What skills could we teach and measure in a new liberal arts?
Common ways to communicate now include snappy blog entries, reports, collateral material, diagrams, visualizations, illustrations, and infographics. Even scholarly think tanks that discuss the unemployability of undergraduates, such as the Georgetown Center on Education and the Workforce and the Institute for Higher Education Policy, publish white papers and reports with distinct efforts in graphic design to be distributed for free on the Internet. The Bain Report that famously said a third of all colleges are in poor financial health was released with an interactive website. The term paper should be a dying artifact, and I’m not sure that it is.
Let’s put it this way: as a businessman, I wouldn’t pay anyone for a well-written literature review, but I would pay quite handsomely for a brochure that resonates with the audience I am trying to reach. I’d pay even more for someone to code it up into a website. Presentations in the work world now model themselves on Steve Jobs’ keynotes and TED talks. The governing book on presentation style, Resonate, is filled with directions on communicating bold ideas through simple story structures. The last presentation I saw by an academic was mind-numbingly complex in its research and statistical methods. You know what? Nobody paid attention to the research methods; the audience wanted to understand the key points to take away, remember, and discuss with others. They were also confused as to why anyone should care in the first place. People walked away feeling that the academic might be thorough and erudite but had forgotten to communicate in the process.
Liberal arts programs should start with a course on visual communication, and then develop these skills by requiring they be used and demonstrated across the curriculum. These skills include:
Illustration and animation
Oration, rhetoric and narrative
Sketching and drafting
Numeracy and Data Literacy
There are broad advantages to people who can hold their own with math, and this is no longer just about understanding the basics behind a calculator and being able to do accounting. We need to face facts: we teach mathematics as if we’re preparing bookkeepers for the pre-computer world, analysts for big banks, or math and physics professors. But there’s an explosion of jobs that need advanced numeracy and data literacy, with data storage, management, analysis, and visualization techniques all as fundamental skills.
This isn’t a back-room skill set anymore. The job of "data scientist" is being created everywhere simultaneously. If you think the only careers for mathematics are in finance and academe, just read about what Facebook expects people on its research team to know. It’s not just tech, either: The New York Times increasingly uses infographics that connect with readers so well that someone has made a page devoted to them. There is even a startup called visual.ly that’s entirely devoted to producing infographics at scale. If you’ve met anyone who went into policy or business after "thematic" undergraduate coursework, they’ll likely tell you that they encounter statistics and data on the job in ways that make them wish they’d learned more about statistics, spreadsheets, analytical software, and other tools that help generate meaning out of all this data.
The new liberal arts should start with and continually ask students to acquire and practice mathematics as a form of analysis and knowledge creation. The necessary skills include:
Data analysis (statistics) and experimentation
Data storage and management
Applied mathematics and mathematical literacy
Estonia just decided that all of its first-graders are going to learn to code, and an article in VentureBeat claims that the country will as a result "win the Great Brain Race." The same article describes our education system as "running on empty when it comes to tech literacy, leaving too many young adults unprepared to compete in a digitally driven economy." Matt Mullenweg, the founder of WordPress, openly and repeatedly explains that "scripting is the new literacy." Yet the number of computer science degrees awarded dropped over the last decade, and the recent uptick isn’t happening fast enough.
Then again, we don’t necessarily need more graduates with arcane knowledge of computer science; we need all graduates to be familiar enough with code to use the computer, the Internet, and mobile devices as tools. Academe and the American public need to quit viewing computer science as a geeky back-room endeavor. It has little to do with science, or even with computers. Coding is about manipulating information to create meaning, which is likely how you would define writing. After all, there’s a reason we call it a computer "language." Students should understand how to develop applications on the Web, on mobile devices, and even native to the operating system.
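The claim that coding is "manipulating information to create meaning" can be shown in a few lines. This sketch is an invented illustration, not anything from the article: it reduces a passage of raw words to what recurs, a crude machine version of summarizing.

```python
from collections import Counter

# A reader summarizes a text by noticing what it keeps returning to;
# three lines of code can perform a crude version of the same act.
text = ("the liberal arts must adapt and the liberal arts must create "
        "because the world rewards those who create")

words = text.split()
top_three = Counter(words).most_common(3)
print(top_three)  # the most frequent words, with counts
```

Trivial as it is, this is the literacy argument in miniature: a philosophy major who can write this has turned information into meaning with code.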
A Call to Action
If you agree with Brian Mitchell of the Edvance Foundation that "the value of a liberal arts degree ... must be that it is as vital, dynamic, and complex as the civilization that values it," then you must also agree that the liberal arts must ask students to engage in work and produce end products that our newly digitized civilization values. And the liberal arts must be as dynamic and vital as its academic proponents claim it to be. I believe it is.
Many liberal arts colleges require a foreign language – not because they believe their history majors will land jobs in France or Mexico, and not because those students are being trained as translators, but because they believe the skills learned in a new language create global citizens who are open to and comfortable with interacting in a multicultural, multilingual world. It’s the same with the skills above. They need to be understood not as a way to turn philosophy majors into geeks, but as a way of telling the world that a philosophy major can be open to and comfortable with, dare I say even take advantage of and thrive in, a technologically changing world.
Students who graduate with a degree in liberal arts should understand the basic canon of our civilization as well as their place in the world, sure, but they also need to understand how to explore and communicate their ideas through visual communication, data manipulation, and even making a website or native mobile app. If they can’t, they’ll just understand the global context of their own unemployment.
Michael Staton is founder and chief evangelist of Inigral, Inc. Follow him on Twitter @mpstaton
Submitted by Paul Fain on October 16, 2012 - 3:00am
Merging campus civic engagement and economic development can create "engaged learning economies," which are a boon to both colleges and local communities, according to a new report from Campus Compact, a national coalition of 1,200 college and university presidents. The report describes 25 examples where this has worked, including efforts by Widener University to work with local groups to help improve the economy of low-income Chester, Pa., which is home to the university.
Everyone applauds the idea of critical thinking, and liberal arts colleges often make their ability to teach critical thinking a key selling point. But no one seems to define what they mean by that term.
As I prepared for the start of classes this fall, I tried to pinpoint the critical thinking skills I really want my students to learn. And as I listened to public debates on everything from tax policy to Obamacare, five essential thinking skills seemed to be missing, again and again. So, based on our dysfunctional national dialogue, here are the "core competencies" I hope to instill in my students:
1. The ability to think empirically, not theoretically. By this I mean the habit of constantly checking one's views against evidence from the real world, and the courage to change positions if better explanations come along. I have great admiration for scholars like Richard Muller, the University of California physicist and global warming skeptic, whose work was heavily funded by the conservative Koch brothers. When new, more comprehensive data from his own research team provided convincing evidence of global temperature increases, Muller changed his mind, and later sounded the alarm about carbon dioxide emissions. Unfortunately, however, much of our public debate on many issues seems to be a clash of theoretical world views, with neither side willing to dispassionately examine the evidence or modify their views. In Congress, the individuals most willing to change their minds – the moderates – have been systematically driven out by more extreme candidates who are dedicated to holding fast to their predetermined positions, regardless of subsequent facts.
2. The ability to think in terms of multiple, rather than single, causes. When you drop a book, it will fall on the floor -- a single-cause event. But most of the interesting things in the world have multiple causes; educational success, for example, is affected by a student's aptitude, but also by the educational achievements of the student's parents, the quality of the school he or she attends, and the attitudes and intelligence of the other students in that school. In such cases, simple comparisons become unreliable guides to action, because the effects of intervening variables haven't been screened out. So, for example, judging a president by Reagan's famous question – "Are you better off now than you were four years ago?" – implicitly assumes that presidential actions are the only variable affecting the economy. This is, of course, nonsense – our globalized economy is affected by a huge variety of factors, including exchange rates, oil prices, the fate of the European Union, the strength of the Chinese economy, and so on. In these situations, we need higher-order analysis that adjusts for these external factors to gauge the true effect of a policy.
3. The ability to think in terms of the sizes of things, rather than only in terms of their direction. Our debates are largely magnitude-free, but decisions in a world with constrained resources always demand a sense of the sizes of various effects. For example, President Obama contends that investments in education and infrastructure are crucial to the nation’s future growth. And it makes intuitive sense that better-educated workers would be more productive, and that repaired highways could transport goods to market more quickly and at lower cost. But Republicans are dead-set against new taxes to pay for these investments. In such a polarized situation, the only way to finance these programs would be to borrow money, and these days much of the government’s borrowed funds are supplied by overseas investors from places like China and Japan. The interest payments on government bonds, then, are a real hindrance to economic growth. The wisdom of these investments, therefore, depends critically on the magnitude of the two effects. How big are the payoffs from investments in education and infrastructure? How much of our debt is owned by foreigners, and what interest rate will we have to pay to them? These kinds of debates cannot be solved by looking only at the direction of anticipated effects, because without quantification, we have no basis for comparison of those effects. In politics and policy, size matters.
4. The ability to think like foxes, not hedgehogs. In his seminal book, Expert Political Judgment, Philip Tetlock followed Isaiah Berlin in distinguishing between hedgehogs, who know one big thing and apply that understanding to everything around them, and foxes, who know many small things and pragmatically apply a "grab bag" of knowledge to make modest predictions about the world. In his study of hundreds of foreign policy experts over 20 years, Tetlock showed that foxes outperform hedgehogs in making predictions, and hence tend to make better decisions. But our current political climate favors hedgehogs, because they tend to be more confident, forceful, and predictable in their views. Mitt Romney's choice of Paul Ryan as a running mate can be seen as an attempt by a fox (Romney) to capture some of the allure and excitement surrounding a hedgehog (Ryan).
5. The ability to understand one's own biases. An expanding literature in psychology and behavioral economics suggests that we are full of unconscious biases, and a failure to understand these biases contributes to poor decision-making. Perhaps the most common and dangerous of these is confirmation bias, the tendency to seek out information in accordance with our previous views and ignore or dismiss information contrary to those views. This undermines our ability to weigh the evidence in an evenhanded manner. Our media culture reinforces this problem, as liberals have their MSNBC, The Nation, The New York Times and think tanks like the Center for American Progress, while conservatives have their Fox News, the National Review, The Wall Street Journal and the Heritage Foundation. In the current world, no one need bear the inconvenience of contrary information.
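The second skill above — screening out intervening variables before comparing outcomes — can be made concrete with a small worked example. The schools and numbers below are invented (a Simpson's-paradox-style illustration, not data from this essay): a "new method" looks worse when both schools are pooled, yet is better within each school, because one school is simply harder.

```python
# Invented pass/fail counts for two schools and two teaching methods.
# school -> {method: (passes, students)}
results = {
    "School A": {"new": (81, 87),   "old": (234, 270)},
    "School B": {"new": (192, 263), "old": (55, 80)},
}

def rate(passes, total):
    return passes / total

# Naive, single-cause comparison: pool both schools together.
new_passes = sum(r["new"][0] for r in results.values())
new_total  = sum(r["new"][1] for r in results.values())
old_passes = sum(r["old"][0] for r in results.values())
old_total  = sum(r["old"][1] for r in results.values())
print(f"pooled: new {rate(new_passes, new_total):.0%} "
      f"vs old {rate(old_passes, old_total):.0%}")

# Adjusting for the school (the intervening variable) reverses the picture:
# the new method wins within each school, despite losing in the pooled total.
for school, r in results.items():
    print(f"{school}: new {rate(*r['new']):.0%} vs old {rate(*r['old']):.0%}")
```

This is exactly why "Are you better off now than four years ago?" is unreliable as analysis: the pooled comparison silently lets a confounding variable do the talking.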
In general, our public debates are textbook examples of non-critical thinking. But these five traits can provide a foundation for a more enlightened dialogue in the future. And students with these skills will think about their world in a deeper, more constructive way.
Paul Gary Wyckoff is professor of government and director of the Public Policy Program at Hamilton College.
Research released Monday by the National Bureau of Economic Research suggests that student performance on tests may be related not only to knowledge gained, but time between significant tasks. The new research -- by Ian Fillmore and Devin G. Pope of the University of Chicago -- examined student performance on Advanced Placement exams. The AP final exams are not always on the same schedule, so students who take more than one AP exam have varying amounts of time between the tests. The study found "strong evidence" that having shorter time periods between exams resulted in lower scores on the second exam. Students who take two exams with 10 days between them are 8 percent more likely to pass both exams than those who take the exams one day apart. An abstract of the study may be found here.
Are you a faculty member or administrator who thinks that the latest technologies are finally going to enable us to teach our students well, or do you at least hope that’s the case? If so, you should reconsider, because the vaunted elements of the latest technologies have been around for some 100 years. It isn’t having the technology, but using the technology that is key to helping students learn well.
For at least the past decade there has been much talk about the advantages of highly sophisticated online courses and the use of online tools in traditional courses. One of the significant advantages of technology-enhanced courses, it is said, is that they can be tailored to individual students’ needs, and thus achieve desired learning outcomes for each student better and faster.
Consider for example, this quote from the website of the Apollo Group, the parent company of the University of Phoenix: "Based upon the belief that learning is not a one-size-fits-all experience, Apollo Technology developed the technology to deliver data-driven, personalized education tailored to the individual. Apollo Technology’s unique student data system collects and analyzes individual student data, and delivers automatic just-in-time guidance that can significantly improve student outcomes." In 2010, the University of Phoenix announced a new Learning Management System, the Learning Genome Project, that "gets to know each of its 400,000 students personally and adapts to accommodate the idiosyncrasies of their 'learning DNA.'" Similarly, a recent article in The New York Times stated: "Because of technological advances — among them, the greatly improved quality of online delivery platforms, the ability to personalize material … MOOCs [massive open online courses] are likely to be a game changer."
These statements are evidence of the general belief that now, using technology, we can achieve all sorts of personalized instruction, which constitutes a revolution in how we can help students learn.
But using technology to individualize student learning is not at all a new idea — it does not originate with online courses or with the technology developments of the past decade, or two, or even three. Using technology to individualize student learning is an idea going back at least 100 years. One of the original learning theorists of the modern era, Edward Thorndike, stated in his 1912 book: "If, by a miracle of mechanical ingenuity, a book could be so arranged that only to him who had done what was directed on page one would page two become visible, and so on, much that now requires personal instruction could be managed by print."
A couple of World Wars later, one of Thorndike’s intellectual descendants, B.F. Skinner, recognized as the most eminent psychologist of the 20th century, was developing and crystallizing the field of operant conditioning, the form of learning in which so-called voluntary behavior changes as a result of its consequences. In the third and final volume of his autobiography, Skinner relates that in 1953, in seeing how his daughters were being educated at the Shady Hill School, "I suddenly realized that something had to be done. Possibly through no fault of her own, the teacher was violating two fundamental principles: the students were not being told at once whether their work was right or wrong (a corrected paper seen 24 hours later could not act as a reinforcer), and they were all moving at the same pace regardless of preparation or ability. But how could a teacher reinforce the behavior of each of 20 or 30 students at the right time and on the material for which he or she was just then ready?.... A few days later I built a primitive teaching machine."
Skinner later developed more sophisticated versions of teaching machines, demonstrating one at the University of Pittsburgh in 1954. These machines presented math problems one at a time, with students having to solve each problem before being able to go on to the next.
In 1961 Skinner took a somewhat different approach to personalized instruction when he published, with Holland, the programmed textbook The Analysis of Behavior. This book focused on the principles of learning, more specifically, the principles of classical (Pavlovian) and operant conditioning, with an emphasis on the latter. The introductory pages of the book, echoing Thorndike in 1912, state that "the material was designed for use in a teaching machine…. Where machines are not available, a programmed textbook such as this may be used. The correct response to each item appears on the following page, along with the next item in the sequence."
Students wrote down their answers before turning the page, and repeated a section if more than 10 percent of the answers in that section were incorrect. I first encountered this book in the summer of 1968, as a 15-year-old student in a psychology course taught under the auspices of the National Science Foundation. Similar to other students in my group that summer, I finished this text within weeks and loved it. In 1964, in seventh grade, I had been the beneficiary of another programmed textbook, English 3200. This book was part of a very successful series that taught English grammar.
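The mechanics just described — answer each item before "turning the page," repeat a section if more than 10 percent of the answers were wrong — amount to a simple algorithm. A minimal sketch under those assumptions (the items and the answer function are invented placeholders, not content from the Holland-Skinner text):

```python
def run_section(items, answer_fn, max_error_rate=0.10):
    """Present items one at a time and report whether the section passed.

    items      -- list of (prompt, correct_answer) pairs
    answer_fn  -- callable mapping a prompt to the student's answer
    Returns (passed, wrong_count): the programmed-textbook rule repeats
    a section when more than max_error_rate of the answers are wrong.
    """
    wrong = 0
    for prompt, correct in items:
        response = answer_fn(prompt)  # student responds before seeing the answer
        if response != correct:
            wrong += 1                # immediate feedback would be given here
    passed = wrong / len(items) <= max_error_rate
    return passed, wrong

# Hypothetical ten-item section: a perfect student passes; a student who
# misses two items (20% wrong) would be sent back through the section.
section = [(f"item {i}", i) for i in range(10)]
passed, wrong = run_section(section, answer_fn=lambda p: int(p.split()[1]))
print(passed, wrong)
```

The point of the sketch is how little machinery the method needs — which is why a printed book could implement it in 1961.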
Another well-known figure in the origins of operant conditioning, Fred Keller, published his iconic article, "Good-bye Teacher…" in 1968. In this article he essentially advocates breaking down the entire teaching process to its elements, and conducting each of those elements more efficiently. The prime function of the teacher becomes, not to lecture, which is best left to automated means, but to engage in direct interaction with students in support of their individualized instruction. More specifically, Keller points out as important the following teaching elements:
1. Highly individualized instruction that allows students to progress at their own speed.
2. Clear specification of learning outcomes (the specific skills to be achieved).
3. Clear specification of the steps needed to achieve these learning outcomes.
4. A goal of perfection for each student and for each stage in the learning process.
5. Two types of teachers: Classroom teachers whose duties include "guiding, clarifying, demonstrating, testing, grading," and other teachers who deal with "course logistics, the interpretation of training manuals, the construction of lesson plans and guides, the evaluation of student progress, the selection of [classroom teachers], and the writing of reports for superiors."
6. Using lectures as little as possible — more as a way to motivate students, and using student participation as much as possible.
7. Lots of testing, all with immediate feedback to students, which helps to ensure student learning.
This breakdown of the learning process makes large parts of that process, parts that are ordinarily done in classrooms involving direct human interaction, well suited for being done by technology. However, humans are clearly still needed for specifying the learning outcomes and the steps required to reach them, as well as other tasks involving analysis and creativity and complex interactions with students.
Just a few years later, in the fall of 1972, I took an undergraduate course on learning at Harvard University, taught by William Baum, that followed the "Keller plan." The work was divided into 26 units, each requiring some reading, some questions to which answers had to be found and learned (50 to 80 such questions per unit, some of which would require an essay to really answer properly), and a written and an oral quiz. Students were not allowed to progress to the next unit until they had passed the written and oral quizzes for the preceding unit, and individual instruction with Baum or his graduate teaching assistant was always available. However, due to the large number of units in this 14-week course, and the difficulty of the quizzes, which students often did not pass, very few students finished the entire sequence and so very few students received an A. Thus using the Keller method does not automatically result in students doing well. The application of such teaching techniques is critical.
Lest anyone think that visions of improving learning by the use of technology are limited to psychologists, 1995 saw the publication of an outstanding work of science fiction by Neal Stephenson, The Diamond Age. A central theme in this work is an interactive book, owned by a small girl, that greatly facilitates her learning, development, and upbringing. We cannot yet achieve the degree of device interactivity that Stephenson describes, but we can achieve elements of that interactivity, and Stephenson gives us a vision of the possibilities.
In 1998, Frank Mayadas, then a program director at the Sloan Foundation, gave the keynote address at the City University of New York’s Baruch College’s first annual Teaching and Technology Conference. In this address he pointed out that all forms of college learning have three elements in common: an expert, who oversees the process; information sources; and colleagues, with whom a student learns. All three are important in the learning process, and all three may be instantiated in different ways depending on the modality of instruction. Although current technology cannot by itself design a new course, it can serve well as an information source, and it can assume some of the functions of colleagues. As technology continues to develop, the functions that it can serve will increasingly closely resemble those that have traditionally been served by humans.
The more recent past, 2010, saw the publication of DIY U by Anya Kamenetz. Consistent with Keller in 1968 and Mayadas in 1998, Kamenetz also would separate the components of the learning process, instead of concentrating them all in a course’s single professor as has been largely the case until now. In her vision of the future, individualized instruction is assumed, with technology playing a significant role, including by taking over those parts of teaching that can be automated.
Kamenetz’s vision is not far away given what is already happening on today’s campuses. As stated in a 2012 report from the Ithaka organization, "Barriers to Adoption of Online Learning Systems in U.S. Higher Education": "Literally for the first time in centuries, faculty and administrators are questioning their basic approach to educating students. The traditional model of lectures coupled with smaller recitation sections (sometimes characterized as 'the sage on the stage') is yielding to a dizzying array of technology-enabled pedagogical innovations." One primary use of technology is to deliver lecture material outside of class, while class time is used for discussion and other active interactions involving the instructor and the students. This is known as the flipped classroom, which turns "traditional education on its head." But recall Keller’s 1968 suggestions about how teachers should be used for "guiding, clarifying, demonstrating, testing, grading," and that lectures should be "used as little as possible … and student participation as much as possible." It seems that the new invention of the flipped classroom is not so new at all.
What encourages these recent statements about the benefits of technology for learning is a worldwide recognition that what is important in higher education is the achievement of specific, agreed-upon learning outcomes. Although this emphasis was present at least from 1912 in the work of learning theorists such as Thorndike, who emphasized the end result — the behavioral goal — in their approach to changing behavior, it has only been in the past few decades that such recognition has become prominent in higher education.
One example is contained within what is known as the Spellings Report (the 2006 report of the commission that was appointed by then-Secretary of Education Margaret Spellings). A major point of this report was that "[a]ccreditation agencies should make performance outcomes, including completion rates and student learning, the core of their assessment as a priority over inputs or processes." It is this emphasis on learning outcomes that, in part, enables the use of technology in the learning process. Once the learning outcomes are specified, the process of helping students to achieve them can be programmed, using increasingly sophisticated technology.
Many of the elements of good teaching discussed here — for example, individualized instruction, frequent testing, focus on outcomes, immediate feedback — now have sound laboratory evidence to support their use (see a comprehensive survey here). We seem to have forgotten their behavioral psychology origins and history, yet it is their effectiveness that is important in the end. Perhaps there are additional lessons to be learned from behavioral scientists, however, in the use of technology to facilitate instruction. We have only to look at casino attendees, particularly the users of slot machines, to see evidence of what Skinner and Keller knew firsthand in the laboratory with rats, that animals (including humans) respond at a high, continuous, persistent rate on variable ratio schedules (situations in which each reward arrives after a variable number of responses). Using such knowledge, in addition to knowledge from cognitive psychology about how best to structure concepts, can result in online courses that not only make concepts easy to learn and remember but, similar to slot machines, are almost irresistibly attractive.
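The variable-ratio schedule mentioned above is easy to simulate, which makes the slot-machine comparison concrete. This is a toy illustration (the parameters are invented, not drawn from Skinner's or Keller's experiments): each reward arrives after a randomly varying number of responses, so the next payoff is never predictable, yet the long-run reward rate is steady.

```python
import random

def variable_ratio_session(mean_ratio, responses, seed=0):
    """Simulate a variable-ratio schedule: each reward is delivered after
    a number of responses drawn uniformly around mean_ratio, so no single
    response predicts a payoff -- the property behind slot machines."""
    rng = random.Random(seed)
    rewards = 0
    count = 0
    next_threshold = rng.randint(1, 2 * mean_ratio - 1)
    for _ in range(responses):
        count += 1
        if count >= next_threshold:            # unpredictable payoff arrives
            rewards += 1
            count = 0
            next_threshold = rng.randint(1, 2 * mean_ratio - 1)
    return rewards

# Over many responses the reward rate approaches 1 / mean_ratio, yet any
# individual reward remains unpredictable -- the combination that sustains
# high, persistent response rates in the laboratory and at the casino.
print(variable_ratio_session(mean_ratio=5, responses=1000))
```

An online course that paced its feedback this way would, for better or worse, borrow the same motivational machinery.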
Keller in 1968 summed up his position on teaching with the following:
Twenty-odd years ago, when white rats were first used as laboratory subjects in the introductory course, a student would sometimes complain about his animal’s behavior. The beast couldn’t learn, he was asleep, he wasn’t hungry, he was sick, and so forth. With a little time and a handful of pellets, we could usually show that this was wrong. All that one needed to do was follow the rules. “The rat,” we used to say, “is always right.”
My days of teaching are over. But … I learned one very important thing: the student is always right. He is not asleep, not unmotivated, not sick, and he can learn a great deal if we provide the right contingencies of reinforcement.
Although we can all agree that college students are certainly not the same as casino attendees or lab rats, we can also all agree that technology, designed and used correctly, can facilitate instruction through personalization as well as through motivation. (The popular appeal of many online role-playing games is one example of that.)
The teaching techniques and tools discussed here have been promoted by behavioral psychologists for the past century. What lessons can we learn from this? One is that it is possible to facilitate learning using the techniques discussed here, such as personalized instruction, without ever having to use the latest (very expensive) technology. There are times when a relatively cheap programmed textbook will help someone learn, perhaps not as well as the best online programs, but very well.
A related lesson is that it is not the existence of the latest technology or its potential uses that will help us to maximize student learning, but using what we know and have. Faculty must be both aware of the techniques and tools at their disposal, and want to use them. This requires proper training during graduate school, professional development later on, and appropriate college and university incentive structures (all of which have been too often missing if the repeated rediscovery of these techniques and tools during the past century is any indication).
The sorts of tools that we have needed to help students learn have been around for 100 years, albeit continuously improved. It is our job to — finally — use those tools.
Alexandra W. Logue is executive vice chancellor and provost of the City University of New York.