LaGuardia Community College's enhanced GED preparation program substantially boosts GED pass rates and the likelihood of college enrollment, according to a newly released study by MDRC, a nonprofit social research firm. Students in the program, which is designed to serve as a pathway to college and careers, were more than twice as likely to pass the high school equivalency exam as were students in traditional GED prep courses. They were also three times as likely to enroll in college.
People who hire and supervise others in the real world are desperate to hire people — our graduates — who have the "whole package": substantive knowledge plus "soft" skills (basic responsibility, working well with others, ethics, etc.) that contribute to success in the world of work. You might argue that teaching those skills isn't our problem because we’re providing educational foundations for professional knowledge. Or that we can hardly be held responsible for failings of families and society, which ought to be the ones instilling work ethic and manners and common sense.
Still, didn’t we open this can of worms ourselves when we started arguing that colleges and universities are engines of economic development and that government should keep (or go back to) investing in education because it creates a knowledgeable workforce? When employers complain about what they perceive as a lazy and entitled attitude among young workers, and we see an apparently never-ending stream of ethics scandals, maybe there’s another way to think about this that is directly congruent with our mission and, furthermore, falls directly within our expertise: embedding ethics and concepts of professional responsibility throughout our curriculums and courses.
If you think about it, doing so is a positive and preventive approach to what many perceive as an epidemic of cheating. There is research suggesting that an educational approach can be an effective strategy, and if enough faculty members purposefully and thoughtfully incorporate ethical connections into classes, it will help those among our students who mean well and want to follow the rules. If we can help those students to find a voice and provide positive examples, we gain, too.
Over the years, I’ve heard countless arguments about why faculty cannot or do not include ethics in their courses, or add courses about professional responsibility to their disciplines. The curriculum is too full already, and besides, you cannot teach people not to lie and cheat if they didn’t learn that in their families. The objections I hear go further, though, and betray a serious discomfort, fear even, about teaching "ethics": I don’t want to have to talk about deontology (I don’t like Kant, or haven’t read him and don’t want to); it’s too hard or too subjective; I’m not qualified; someone else can handle it (bosses, the research compliance people, someone across the street, whatever). Ethics is boring and dry. I don’t know enough and don’t have time to go learn another field while I’m working on getting promoted/getting the next grant/serving on too many committees. What if someone asks a question and I don’t know the answer? What if I look stupid? I might come off as judgmental or not judgmental enough. A required event is going to get really bad student evaluations.
We Can All Teach This Stuff, and We Should
As higher education experiences disruptive transformation through the changing economics of what we do, price pressures and technological upending, homing in on what we uniquely do is likely to be part of our path to the future. What is more central to that than helping students explore questions about and learn to use responsibly the knowledge we are conveying? The responsibilities of professionals — researchers, scientists, scholars, teachers — are deeply personal ones, and too important to leave to others outside our disciplines to teach. Outsourcing shortchanges our students and ourselves.
If you think matters of professional responsibility in your discipline matter, if you care about accountability and transparency and fairness and rigor, you can and should teach ethics in your field, whether that’s a course or workshop that meets the requirements for responsible conduct of research education or topics that you integrate into your substantive classes — or both.
There are good reasons to teach ethics in courses that are not about ethics, and it needn't be daunting or hard. There are some straightforward ways to do it, and as a practicing professional in your field (they pay you to do what you do at work, right?), you can and you should. Here’s how.
1. Think and talk about your mistakes. Who hasn’t made a mistake at work? A big one? An embarrassing one? One you still cringe thinking about? What did you learn from those mistakes? If you’ve thought about it over the years, can you talk about it, obviously not naming names if that would violate confidences or confidentiality requirements?
How did you learn about, for example: How to deal with a student or colleague who disappoints you or violates your trust? What to and, even more importantly, what not to do when you make a serious professional mistake?
Have you ever looked back on something that seemed perfectly reasonable at the time, and with the value of hindsight, thought "How could I have been such an idiot?" Or, been sitting with someone who’s making a huge mistake and thought "no, no, no!"
If you can find a way to talk about those moments and the lessons you took away from them, your students will learn. Talking calmly and clearly about mistakes you have made will shape them as professionals and as people — and, not so coincidentally, the world you are going to live in when they take over. (Another plus: modeling how you deal with hard stuff, and showing that life and careers rarely follow a clean, clear forward path without setbacks, will be memorable, and they will like you all the more for it.)
2. Articulate one of the lessons that govern your professional life. Where and when did you learn about the value of boundaries and when to refer students to other resources rather than trying to help them yourself? That it’s easier to start out relatively strictly in a course and relax the rules as you go than vice versa? That’s a lesson that extrapolates to a lot of other contexts. How did you learn to set the ground rules for talking to reporters about your work or setting boundaries when acting as a consultant or expert witness? When have you made a hard choice about a professional topic that you found challenging? If the lesson is connected to a mistake, it will be even more gripping to your class.
If you ask the students to make a connection to the topic you’re teaching that day, you will likely be surprised and pleased with what emerges. And even if your examples are all from your life in academe, they will likely hold relevant lessons for students looking at other careers.
3. Talk with students about ethical dilemmas or hard moments they’ve faced (or will face). For years, I’ve asked students to write a short (200 word) description of an ethical dilemma they have faced. (This is an assignment idea from Harris Sondak of the University of Utah, a friend of a friend who was kind enough to talk with me about his teaching techniques and syllabus when I first started teaching ethics in a business school.) Not only does this essay get students thinking about these issues in their own lives, properly managed it creates a wonderful set of discussion topics.
Even if you don’t ask students to do exactly that, or if you adapt and ask them to write about ethical applications of your topic or questions they have, it will tell you a lot about where the students are. In the dilemmas I’ve gotten over the years, the same issues come up over and over again: bosses who put pressure on workers to cut corners to meet deadlines. Perverse incentives in reward systems. Peer pressure. Temptation and rationalization in the face of a desire to succeed. You know, all those human frailties that come up when you work with other people.
And not one of those is hard to connect to the kinds of problems our students will face in what they do after college or grad school. Believe me, they are all cued into power imbalances, fairness, and how to navigate difficult situations. Connect it to how you use what you’re teaching, even if you only do that once in a while, even if it’s only talking about your policy for awarding grades, and you’ll be contributing to their development in a broader way.
Students who’ve never held a job have faced dilemmas in school, like a friend who asked for help with an assignment when it was against the rules to collaborate. That situation is relevant to almost every class, and a great place to use it is when you’re discussing the syllabus, especially if that’s all you do on your first day (contrary to advice offered here).
If you’re nervous about flying blind, take a look at the range of ethics resources, including the “two-minute challenge” (2MC) collection on Ethics CORE. What’s a 2MC? It’s a problem that you cannot necessarily resolve in two minutes, but that comes up and may require a response in two minutes — or less. It’s the kind of problem that arises all the time in professional life and that you need to be prepared to handle. Use the same simple framework for structuring discussion of your own or other ethical dilemmas.
Don’t come prepared with the “answer,” and do come prepared to point out that you already know what you would do in hard situations (mostly), and that you won’t be going to work with them, so it’s THEIR answers that matter the most. If you are going to opine or editorialize, do it only after they’ve all had their say. Prepare a few questions to keep the discussion going, using the framework as your basis for that.
If you do that, based on real problems people (in the room sometimes!) have faced, you’ll be doing some of the most important things that emerging research on efficacy in ethics education suggests: using short examples that carry emotional punch because they happened to real people. Modeling a way to talk about them. Helping to analyze them by practicing. Over and over. (If any of them are musicians or athletes, ask them to talk about the value of practicing scales or free throws for a useful analogy.)
You’ll be helping your students to anticipate the consequences of various actions, and to put labels on the problems (deception, temptation, rationalization, slippery-slope problems…).
Or pick articles out of the newspaper or journals in your field about someone who’s crossed the line. If you cannot find something, go to Ethics CORE and look at the recent news feed. There won’t be a shortage of examples. Look for the videos. Try out some of the role plays there. Read my most recent book and use some of those examples.
There are lessons that your students will learn from you directly about professional responsibility that you can teach better than anyone else: How you deal with temptation. What to do in the face of a bureaucracy’s truly stupid rules. What’s the difference between exceeding a 55 mph speed limit and a regulation that 55 parts per million is the allowable limit for contamination in a sample (thanks to Bob Wengert of the University of Illinois philosophy department for that example). How you decide what’s right and what’s wrong. How you act on it. What you’re willing to sacrifice for your principles. (Are they really principles if you’re not willing to sacrifice for them?)
You are a practicing professional. Who better than you to teach your students about professional ethics in your field?
C.K. Gunsalus is the director of the National Center for Professional and Research Ethics, professor emerita of business, and research professor at the Coordinated Sciences Laboratory at the University of Illinois at Urbana-Champaign. She is the author of The Young Professional's Survival Guide (Harvard University Press).
Generation Xers (people who are now in their late 30s) are embracing the idea of lifelong learning, according to a new study by the University of Michigan. The study found that 1 in 10 GenXers are currently enrolled in classes to continue their educations. And 48 percent of the 80 million GenXers take continuing education courses, in-service training or workshops required for professional licenses and certifications.
Virtually everywhere you turn, somebody is promoting the idea that technology is a solution, if not the solution, to educational completion. Panelists at conferences, politicians, foundation officials, and journalists and bloggers promote the view. It is also being supported loudly by the checkbooks of the venture capital community. College completion is, without a doubt, a serious problem. In fact, for the first time, the current generation of Americans entering the workforce is less educated than the generation that is now retiring.
I run an educational technology company, and I read the articles, sit on the panels, and see the venture money flowing. But I have to admit, my first thought is: “Might technology be the problem rather than the solution?”
College retention and completion is a growing and serious problem in the U.S. But exactly how technology helps in education, particularly higher education, can be very difficult to identify and measure. When searching for technology solutions, we should consider the concept of appropriate technology -- using the right amount of technology to solve a core problem -- and ask four questions:
Does it address the core problem?
Is it scalable?
Is it maintainable?
Is it affordable?
We already know several non-technology solutions that are working. Most administrators will agree that good teachers, engaging instruction, individual mentoring and personal advising can directly affect retention and student performance. The problem with these known solutions is cost, time and measurability. Faculty and staff are often burdened with administrative and mundane tasks that infringe upon effective student engagement.
This presents a real opportunity for technology. However, it must be put to work in the right way.
Rather than looking for technology to replace or augment the teacher/student relationship, we can look for ways technology can eliminate everything that is NOT the teacher/student relationship – reducing time spent on administrative tasks and increasing the information available about the individual students and their needs. I call this the "other ed tech."
If technology can free up time for teachers by helping to find open educational resources, streamlining grading, simplifying student/parent communication, and eliminating HR tasks, it will create more time for student interaction. If technology can automate student advising communication and help to identify students at risk it will create more targeted opportunities for effective intervention. If technology can eliminate administrative and institutional overhead it will help to create more effective time and funds for student-facing services. (Disclosure: My company, IData, Inc., helps colleges with some of these things.)
To understand my reaction to the push for technology as a panacea in education, I think back nearly 20 years, to when I volunteered as a teacher at St. Cecilia Mautuma Secondary, a small, rural school in the highlands of Kenya. It was a new, four-room secondary boarding school for girls. This school had almost nothing in terms of technology – a handful of textbooks shared between classes of 25 students, chalkboards that never seemed to have chalk, and an hour of electricity from a car battery to run lights so students could study at night. A number of my friends in the U.S. suggested computers or software to help the girls of Mautuma. The reality was that they needed more textbooks, more teachers and possibly … more chalk.
My time in Kenya introduced me to many Peace Corps volunteers. The Peace Corps operates under the principle of appropriate technology – loosely defined as technology that is locally affordable with locally made/maintained tools that greatly reduce labor requirements and provide new opportunities for productivity.
In essence, if I had dropped a laptop in the middle of Kenya in 1993, it would not have solved anything for those students. There was no electricity, no Internet, no way to fix it and no way to share the resource. Internet technology would not have helped learning in rural Kenya in 1993 because it was not scalable, it was not locally maintainable, it was too expensive and it did not solve the core problems: not enough teachers, not enough books, not enough light to study at night and not enough parents who could afford the modest annual school fees.
Twenty years later, is there a correlation between my experience in Kenya and the current trends in educational technology? Clearly, 21st-century U.S. higher education is different, but we should still consider scalability, maintainability, affordability and whether the solution is solving the core problem.
As education technology remains a hot topic with conversations surrounding MOOCs, big data, mobile apps and open educational resources, we should ask ourselves the following questions:
Are we throwing the right solutions at the problems of higher education?
Do we even understand the problems?
Is there a plan?
Does it help to fulfill the goals of the strategic plan?
As schools develop a technology plan, they should focus on the goals outlined in their strategic plan and look for innovation in processes that frees up resources for the things we know work.
As active participants in the education world, we should always be looking for ways to appropriately apply technology. There are real problems, and a good start would be to focus on saving time and money. Budget is one of the biggest barriers to giving teachers and staff the one-on-one time needed to keep students on track. A large number of tasks done by individual schools could benefit from cost-sharing with peer institutions. Projects like the Predictive Analytics in Retention (PAR) Framework are a great example of multiple schools collaborating to build a single (and better) retention analytics platform.
Ed tech projects can be time and money losers for a school. The guiding principle should be to look carefully at every dollar or hour spent NOT focused on working with students or advancing your strategic plan. If any of those hours or dollars can be eliminated with technology, that seems appropriate.
Brian S. Parish is owner and president of IData, Inc., which helps colleges manage administrative data.
In "Howl," a blistering poetical rant and perhaps the most important poem of the ’60s counterculture, Allen Ginsberg anatomizes the minds of his generation. They are young men and women who "studied Plotinus Poe St. John of the Cross telepathy and bop kabbalah because the cosmos instinctively vibrated at their feet in Kansas." When students come to our offices to consider studying the humanities, we can all recite the litany of reasons for doing so. It provides them with the critical thinking skills needed for success in any career; it endows them with the cultural capital of the world’s great civilizations; and it helps them explore what it means to be human.
But for those of us who have spent our lives studying the humanities, such reasons are often just the fossilized remains of the initial impulse that set us on our educational journey -- the feeling that Kansas was vibrating at our feet, and that to chart our futures we desperately needed to understand the meaning of that vibration.
The main challenge for the humanities teacher has always been to show how the great works of philosophy, literature, religion, history, and art answer to the good vibrations in our young people. But at the dawn of the 21st century the academic scaffolding of the humanities thwarts this fundamental goal. The central problem is that the Harvard University model of humanistic study dominates academia.
The Harvard model sees the humanities as a set of distinct and extensively subdivided disciplines, overseen by hyper-specialized scholars who produce disciplinary monographs of extraordinary intellectual subtlety and technical expertise. Though the abstruse work produced with this model periodically makes it the butt of media jokes, no one with an appreciation for good scholarship would want to eliminate the rigorous discipline represented by the work of scholars at Harvard and institutions like it. But neither should it be allowed to dominate the agenda of all higher education, which it now incontestably does, to the detriment of both the humanities and the students who want to understand the meaning of their unique vibration.
The disciplining of knowledge was central to the creation of the modern research university. In the second half of the 19th century, Harvard and then schools across the academic landscape dropped their common curriculum, creating instead departments and majors. Beginning with the natural sciences of physics, chemistry, and biology, this flowering of disciplines issued in countless discoveries and insights with repercussions far beyond the university. Flush with this success, the new model of knowledge production, and the 19th-century scientific methodology that was its seed, spread to the examination of society. The newly invented social sciences -- economics, sociology, anthropology and the like — grabbed hold of the explosive new problems that followed in the wake of modern industrial life. But at the same time they marginalized the traditional questions posed in the humanities. The social sciences raised "humanistic" questions within the strictures of 19th-century positivist assumptions about scientific "objectivity," and they have been doing so, despite post-modern blows dealt to claims of objectivity, ever since.
As the natural and social sciences divided the world between themselves the humanities threatened to become a mere leftover, a rump of general reflections and insights that lacked the rigor of the special sciences. Eager to be properly scientific themselves, and thereby forestall such a humiliating fate, the humanities disciplined themselves. They sought to emulate the success of the sciences by narrowing their intellectual scope, dividing and subdividing their disciplines into smaller and ever smaller scholarly domains, and turning themselves into experts.
The norm became the creation of inward-looking groups of experts who applied a variety of analytic approaches to sets of increasingly technical problems. In short, the humanities found themselves squeezed by the demands for professionalization and disciplinization, the need to become another regional area of study analogous in form, if not in content, to the other special sciences. And the humanities have been content to play this disciplinary game ever since.
In the last 30 years, the rise of Theory promised to breathe a new, post-modern life into this disciplinary game. By the mid-20th century, the sterility of old-fashioned explication de texte was becoming apparent. The linguistic turn opened up a new way for the humanists to ape the rigor of the sciences while simultaneously extending their scholarly turf. In their zeal for technical rigor, they discovered to their delight that texts marvelously shift shape depending upon the theoretical language used in their analyses. Into the moribund body of the humanities flowed the European elixirs of psychoanalysis, phenomenology and hermeneutics, structuralism and post-structuralism, all of which boasted technical vocabularies that would make a quantum physicist blush. With these languages borrowed from other disciplines, the great books of the Western tradition looked fresh and sexy, and whole new fields of scholarship opened up overnight.
At the same moment, however, scholars of the humanities outside the graduate departments of elite universities suddenly found themselves under-serving their students. For the impulse that drives young people to the humanities is not essentially scholarly. The cult of expertise inevitably muffles the jazzy, beating heart of the humanities, and the students who come to the university to understand their great vibration return home unsatisfied. Or worse, they turn into scholars themselves, funneling what was an enormous intellectual curiosity through the pinhole of a respectable scholarly specialty.
Indeed, their good vibrations fade into a barely discernible note, a song they recall only with jaded irony, a sophisticated laugh at the naiveté of their former selves, as if to go to school to learn the meaning of their own lives were an embarrassing youthful enthusiasm. The triumph of irony among graduate students in the humanities, part of the déformation professionnelle characteristic of the Harvard virus, exposes just how far the humanities have fallen from their original state. As they were originally conceived, the humanities squirm within the research paradigm and disciplinary boxes at the heart of the Harvard model.
The term "humanities" predates the age of disciplinary knowledge. In the Renaissance, the studia humanitatis formed part of the attempt to reclaim classical learning, to serve the end of living a rich, cultivated life. Whether they were contemplative like Petrarch or engaged like Bruni, Renaissance humanists devoted themselves to the study of grammar, rhetoric, logic, history, literature, and moral philosophy, not simply as scholars, but as part of the project of becoming a more complete human being.
Today, however, the humanities remain entrenched in an outmoded disciplinary ideology, wedded to an academic model that makes it difficult to discharge this fundamental obligation to the human spirit. Despite the threat of the Great Recession, the rise of the for-profit university, and a renewed push for utility, the humanities continue to indulge their fetish of expertise and drive students away. Some advocate going digital, using the newest techno and cyber techniques to improve traditional scholarly tasks, like data-mining Shakespeare. Others turn to the latest discoveries in evolutionary psychology to rejuvenate the ancient texts. But both of these moves are inward looking — humanists going out into the world, only to return to the dusty practices that have led the humanities to their current cul-de-sac. In so doing, colleges and universities across the country continue to follow the Harvard model: specialize, seek expertise, and turn inward.
When Descartes and Plotinus and Poe and St. John of the Cross created their works of genius, they were responding not to the scholar’s task of organizing and arranging, interpreting and evaluating the great works of the humanistic tradition, but rather to their own Kansas. Descartes and Rousseau were latter-day Kerouacs, wandering Europe in search of their souls. These men and women produced their works of genius through a vibrant, vibrating attunement to the needs of their time.
The Humanities! The very name should call up something wild. From the moment Socrates started wandering the Greek market and driving Athenian aristocrats to their wits’ end, their place has always been out in the world, making connections between the business of living and the higher reaches of one’s own thought, and drawing out implications from all that life has to offer. The genius of the humanities lies in the errant thought, the wild supposition, the provocation -- in Ginsberg’s howl at society. What this motley collection of disciplines is missing is an appreciation of the fact that the humanities have always been undisciplined, that they are essentially non-disciplinary in nature. And if we want to save them, they have to be de-disciplined and de-professionalized.
De-disciplining the humanities would transform both the classroom and the curriculum. Disengaging from the Harvard model would first and foremost help us question the assumption that a scholarly expert in a particular discipline is the person best suited to teaching the subject. The quality that makes a great scholar — the breadth and depth of learning in a particular, narrow field — does not make a great teacher; hungry students demand much more than knowledge. While the specialist is hemming himself in with qualifications and complications, the broadly-educated generalist zeros in on the vital nub, the living heart of a subject that drives students to study.
While a scholarly specialist is lecturing on the ins and outs of Frost’s irony, the student sweats out his future, torn between embracing his parents’ dream of having a doctor in the family or taking the road less traveled and becoming a poet. The Harvard model puts great scholars in charge of classrooms that should be dominated by great teachers. And if the parents who are shelling out the price of a contemporary college education knew their dollars were funding such scholarly hobbyhorses, they would howl in protest.
De-disciplining the humanities would also fundamentally change the nature of graduate and undergraduate education. At the University of North Texas Department of Philosophy and Religious Studies, located in the Dallas Metroplex, we are training our graduate students to work with those outside their discipline — with scientists, engineers, and policy makers — to address some of the most pressing environmental problems the country faces. We call it field philosophy: taking philosophy out into the world to hammer out solutions to highly complex and pressing social, political, and economic problems. Graduate students participate in National Science Foundation grants and practice the delicate skill of integrating philosophic insights into public policy debates, often in a "just-in-time" manner. In class they learn how to frame and reframe their philosophical insights into a variety of rhetorical formats, for different social, political, economic purposes, audiences and time constraints.
At Calumet College of St. Joseph, an urban, Roman Catholic commuter college south of Chicago that serves underprepared, working-class Hispanic, African-American, and Anglo students, we are throwing the humanities into the fight for social justice. Here the humanities are taught with an eye toward creating not a new generation of scholars, but a generation of humanely educated citizens working to create a just society. At Calumet, students are required to take a social justice class.
In it they learn the historical and intellectual roots of Catholic social justice teaching within the context of performing ten hours of community service learning. They work in a variety of social service fields (e.g., with children, the elderly, or the homeless), which exposes them to the real-life, street-level experience of social challenges. Before, during, and after their service, students bring this experience back to the classroom to deepen it through reflective papers and class discussion.
High-level humanistic scholarship will always have a place within the academy. But to limit the humanities to the Harvard model, to make scholarship rather than, say, public policy or social justice, the highest ideal of humanistic study, is to betray the soul of the humanities. To study the humanities, our students must learn textual skills, the scholarly operations of reading texts closely, with some interpretive subtlety. But the humanities are much more than a language game played by academic careerists.
Ultimately, the self-cultivation at the heart of the humanities aims to develop the culture at large. Unless they end up where they began -- in the marketplace, alongside Socrates, questioning, goading, educating, and improving citizens -- the humanities have aborted their mission. Today, that mission means finding teachers who have resisted the siren call of specialization and training undergraduate and graduate students in the humanities in the art of politics.
The humanist possesses the broad intellectual training needed to contextualize social problems, bring knowledge to bear on social injustice, and translate disciplinary insights across disciplines. In doing so, the humanist helps hold together an increasingly disparate and specialized society. The scholasticism of the contemporary academy is anathema to this higher calling of the humanities.
We are not all Harvard, nor should we want to be.
Chris Buczinsky is head of the English program at Calumet College of St. Joseph in Whiting, Indiana. Robert Frodeman is professor of philosophy and director of the Center for the Study of Interdisciplinarity at the University of North Texas.
Massive open online courses (MOOCs) have captured the nation’s imagination. The notion of online classes enrolling more than 100,000 students is staggering. Companies are springing up to sponsor MOOCs, growing numbers of universities are offering them, and the rest of America’s colleges are afraid they will be left behind if they don’t.
But MOOCs alone are unlikely to reshape American higher education. When history looks back on them, they may receive no more than a footnote. They do, however, mark a revolution in higher education that is already occurring and will continue.
America is shifting from a national, analog, industrial economy to a global, digital, information economy. Our social institutions, colleges and universities included, were created for the former. Today they all seem to be broken. They work less well than they once did. Through either repair or replacement — more likely a combination — they need to be refitted for a new age.
Higher education underwent this kind of evolution in the past as the United States shifted from an agricultural to an industrial economy. The classical agrarian college, imported from 17th-century England with a curriculum rooted in the Middle Ages, was established to educate a learned clergy to govern the colonies. This model held sway until the early 19th century.
In the years before the Civil War, the gap between colleges and society widened. European higher education modernized, creating models that would inspire America to grow our own. Innovations, mostly small, were attempted; many failed. During and after the war, the scale of experimentation increased with the founding of Cornell University and Johns Hopkins University, and, a few decades later, the University of Chicago. Other institutions, such as Harvard University, remade themselves. The innovations spread. By the mid-20th century a new model of higher education for an industrial era had coalesced. It was codified in California's 1960 master plan, which balanced selectivity with access and workforce development.
This transition brought new institutions that better met the needs of an industrializing America.
An entity called the university was imported from Germany, with what would become a mission of teaching, research and service. It offered instruction in professions essential to an industrial society, organized knowledge into relevant specialties, and hired expert faculty in those areas. It not only transmitted the knowledge of the past, but advanced the frontiers of knowledge for the future.
The federal government created the land-grant college to bridge the old agrarian America and the emerging industrial one. Now found in all 50 states, the land-grant college was designed to provide instruction in agriculture and the mechanic arts without excluding classical studies.
Specialized institutions emerged. Some, like the Massachusetts Institute of Technology, were modeled on the European polytechnics; they promoted industrial science and technology and prepared leaders in these fields. Others, the normal schools, sought to provide more and better teachers as the evolving economy demanded more education of its citizenry.
The two-year college — originally called a junior college, later a community college, sometimes Democracy’s College — was initially established to offer lower-division undergraduate education in the local community.
As these institutions emerged, the curriculum changed. Graduate studies were introduced. New professional schools in fields like engineering, business and education became staples. Continuing education and correspondence courses were added. Elective courses and majors arose. Disputation, recitation, and memorization, the teaching methods of the agrarian college, gave way to lectures, seminars, and laboratories.
The colleges that persisted adopted many of the era’s changes, and the classical curriculum largely disappeared.
This is the history of higher education in America. Change has occurred by accretion. The new has been added to the old and the old, over time, modernized. Change occurs with no grand vision of the system that the future will require. New ideas are tried; some succeed, many fail. By successive approximations, what emerges is the higher education system necessary to serve the evolved society.
Social change is a constant, and so is the need for higher education to adapt to it. This adaptation is a natural process, almost like a dance. When the change in society is deleterious, as in the McCarthy era, it is the responsibility of higher education to resist it and right the society. But in times of massive social change like the transformation of America to an information economy, a commensurate transformation on the part of higher education is required.
We are witnessing precisely that today. MOOCs, like the university itself or graduate education or technology institutes, are one element of the change. They may or may not persist or be recognizable in the future that unfolds.
What does seem probable is this. As in the industrial era, the primary changes in higher education are unlikely to occur from within. Some institutions will certainly transform themselves as Harvard did after the Civil War, but the boldest innovations are likelier to come from outside or from the periphery of existing higher education, unencumbered by the need to slough off current practice. They may be not-for-profits, for-profits or hybrids. Names like Western Governors University, Coursera, and Udacity leap to mind.
We are likely to see one or more new types of institution emerge. As each economic and technological revolution creates new needs for higher education, unique institutions emerge to meet them. In the agrarian era, only a tiny percentage of the population needed higher education, and the college served these elite few. When industrial America required more education, more research, and mass access to college, two major institutions were established: the university and the community college.
The information economy, which requires a more educated population than ever before in history, will seek universal postsecondary education and is likely to create new institutions to establish college access for all at low cost. These institutions will operate globally, not locally, which will dictate a digital format. Because information economies emphasize time-variable, common outcomes — unlike the industrial era’s common processes and fixed times (think assembly lines) — universal-access institutions will offer individualized, time-variable instruction, rooted in mastery of explicit learning outcomes. Degrees and credits are likely to give way to competency certification and badges.
Traditional higher education institutions — universities and colleges — will continue, evolving as did their colonial predecessors. Their numbers will likely decline. At greatest risk will be regional, part-time commuter universities and less-selective, low-endowment private colleges, particularly in New England, the Mid-Atlantic, and the Midwest. The future of the community college and its relationship to the universal-access university is a question mark. It is possible that sprawling campuses will shed real estate in favor of more online programs, more compact learning centers and closer connections with employers and other higher education units.
In this era of change, traditional higher education — often criticized for low productivity, high costs, and limited use of technology — will be under enormous pressure to adapt.
Policy makers and investors are among those forces outside of education bringing that pressure to bear. It’s time for higher education to be equally aware and responsive.
Arthur Levine, a former president of Teachers College, Columbia University, is president of the Woodrow Wilson National Fellowship Foundation.
Pearson VUE, which operates a worldwide network of testing centers for various exams, has been experiencing significant technical problems this week. The company's Facebook page features numerous comments from people unable to take their scheduled exams or to get information about when they will be able to do so. Some people are posting stories of how hours-long delays likely affected their performance on exams that are crucial to their careers. On the Facebook page, Pearson indicates that it is aware of the problems and is trying to fix them.
"We are continuing our efforts to restore normal service as quickly as possible. We are in the midst of implementing recommendations by our internal and external technology experts, but it is too soon to know how quickly this will improve system performance. Please note that there will likely be additional variations in system performance as we implement these changes," says a statement posted Thursday evening. "We fully appreciate that many of you have been significantly impacted by the circumstances over the past several days, and we will increase testing capacity and operational support to accommodate scheduling and/or rescheduling of those affected as quickly as possible once normal system performance is restored."