Virtually everywhere you turn, somebody is promoting the idea that technology is a -- if not the -- solution to the college completion problem. Panelists at conferences, politicians, foundation officials and journalists/bloggers all champion the view. It is also being supported loudly by the checkbooks of the venture capital community. College completion is, without a doubt, a serious problem. In fact, for the first time, the current generation of Americans entering the work force is less educated than the generation that is now retiring.
I run an educational technology company, and I read the articles, sit on the panels, and see the venture money flowing. But I have to admit, my first thought is: “Might technology be the problem rather than the solution?”
College retention and completion is a growing and serious problem in the U.S. However, the ways technology actually helps in education, particularly higher education, can be very difficult to identify and measure. When searching for technology solutions, we should consider the concept of appropriate technology -- using the right amount of technology to solve a core problem. For any proposed solution, we should ask:
Does it address the core problem?
Is it scalable?
Is it maintainable?
Is it affordable?
We already know several non-technology solutions that are working. Most administrators will agree that good teachers, engaging instruction, individual mentoring and personal advising can directly affect retention and student performance. The problem with these known solutions is cost, time and measurability. Faculty and staff are often burdened with administrative and mundane tasks that infringe upon effective student engagement.
This presents a real opportunity for technology. However, it must be put to work in the right way.
Rather than looking for technology to replace or augment the teacher/student relationship, we can look for ways technology can eliminate everything that is NOT the teacher/student relationship -- reducing time spent on administrative tasks and increasing the information available about the individual students and their needs. I call this the "other ed tech."
If technology can free up time for teachers by helping to find open educational resources, streamlining grading, simplifying student/parent communication, and eliminating HR tasks, it will create more time for student interaction. If technology can automate student advising communication and help to identify students at risk, it will create more targeted opportunities for effective intervention. If technology can eliminate administrative and institutional overhead, it will help to free up more time and funds for student-facing services. (Disclosure: My company, IData, Inc., helps colleges with some of these things.)
To understand my reaction to the push for technology as a panacea in education, I think back nearly 20 years, to when I volunteered as a teacher at St. Cecilia Mautuma Secondary, a small, rural school in the highlands of Kenya. It was a new, four-room secondary boarding school for girls. This school had almost nothing in terms of technology -- a handful of textbooks shared between classes of 25 students, chalkboards that never seemed to have chalk, and an hour of electricity from a car battery to run lights so students could study at night. A number of my friends in the U.S. suggested computers or software to help the girls of Mautuma. The reality was that they needed more textbooks, more teachers and possibly … more chalk.
My time in Kenya introduced me to many Peace Corps volunteers. The Peace Corps operates under the principle of appropriate technology -- loosely defined as technology that is locally affordable with locally made/maintained tools that greatly reduce labor requirements and provide new opportunities for productivity.
In essence, if I had dropped a laptop in the middle of Kenya in 1993, it would not have solved anything for those students. There was no electricity, no Internet, no way to fix it and no way to share the resource. Internet technology would not have helped learning in rural Kenya in 1993 because it was not scalable, it was not locally maintainable, it was too expensive and it did not solve the core problems: not enough teachers, not enough books, not enough light to study at night and not enough parents who could afford the modest annual school fees.
Twenty years later, is there a correlation between my experience in Kenya and the current trends in educational technology? Clearly, 21st-century U.S. higher education is different, but we should still consider scalability, maintainability, affordability and whether the solution is solving the core problem.
As education technology remains a hot topic with conversations surrounding MOOCs, big data, mobile apps and open educational resources, we should ask ourselves the following questions:
Are we throwing the right solutions at the problems of higher education?
Do we even understand the problems?
Is there a plan?
Does it help to fulfill the goals of the strategic plan?
As schools develop a technology plan, they should focus on the goals outlined in their strategic plan and look for innovations in processes that free up resources for the things we know work.
As active participants in the education world, we should always be looking for ways to appropriately apply technology. There are real problems, and a good start would be to focus on saving time and money. Budget is one of the biggest barriers to giving teachers and staff the one-on-one time needed to keep students on track. A large number of tasks now done by individual schools could benefit from cost-sharing with peer institutions. The Predictive Analytics in Retention (PAR) Framework is a great example of multiple schools collaborating to build a single (and better) retention analytics platform.
Ed tech projects can be time and money losers for a school. The guiding principle should be to look carefully at every dollar or hour spent NOT focused on working with students or advancing your strategic plan. If any of those hours or dollars can be eliminated with technology, that seems appropriate.
Brian S. Parish is owner and president of IData, Inc., which helps colleges manage administrative data.
In "Howl," a blistering poetical rant and perhaps the most important poem of the 60’s counterculture, Allen Ginsberg anatomizes the minds of his generation. They are young men and women who "studied Plotinus Poe St. John of the Cross telepathy and bop kabbalah because the cosmos instinctively vibrated at their feet in Kansas." When students come to our offices to consider studying the humanities, we can all recite the litany of reasons for doing so. It provides them with the critical thinking skills needed for success in any career; it endows them with the cultural capital of the world’s great civilizations; and it helps them explore what it means to be human.
But for those of us who have spent our lives studying the humanities, such reasons are often just the fossilized remains of the initial impulse that set us on our educational journey -- the feeling that Kansas was vibrating at our feet, and that to chart our futures we desperately needed to understand the meaning of that vibration.
The main challenge for the humanities teacher has always been to show how the great works of philosophy, literature, religion, history, and art answer to the good vibrations in our young people. But at the dawn of the 21st century the academic scaffolding of the humanities thwarts this fundamental goal. The central problem is that the Harvard University model of humanistic study dominates academia.
The Harvard model sees the humanities as a set of distinct and extensively subdivided disciplines, overseen by hyper-specialized scholars who produce disciplinary monographs of extraordinary intellectual subtlety and technical expertise. Though the abstruse work produced with this model periodically makes it the butt of media jokes, no one with an appreciation for good scholarship would want to eliminate the rigorous discipline represented by the work of scholars at Harvard and institutions like it. But neither should it be allowed to dominate the agenda of all higher education, which it now incontestably does, to the detriment of both the humanities and the students who want to understand the meaning of their unique vibration.
The disciplining of knowledge was central to the creation of the modern research university. In the second half of the 19th century, Harvard and then schools across the academic landscape dropped their common curriculum, creating instead departments and majors. Beginning with the natural sciences of physics, chemistry, and biology, this flowering of disciplines issued in countless discoveries and insights with repercussions far beyond the university. Flushed with this success, the new model of knowledge production, along with the 19th-century scientific methodology that was its seed, spread to the examination of society. The newly invented social sciences -- economics, sociology, anthropology and the like -- grabbed hold of the explosive new problems that followed in the wake of modern industrial life. But at the same time they marginalized the traditional questions posed in the humanities. The social sciences raised "humanistic" questions within the strictures of 19th-century positivist assumptions about scientific "objectivity," and they have been doing so, despite post-modern blows dealt to claims of objectivity, ever since.
As the natural and social sciences divided the world between themselves, the humanities threatened to become a mere leftover, a rump of general reflections and insights that lacked the rigor of the special sciences. Eager to be properly scientific themselves, and thereby forestall such a humiliating fate, the humanities disciplined themselves. They sought to emulate the success of the sciences by narrowing their intellectual scope, dividing and subdividing their disciplines into smaller and ever smaller scholarly domains, and turning themselves into experts.
The norm became the creation of inward-looking groups of experts who applied a variety of analytic approaches to sets of increasingly technical problems. In short, the humanities found themselves squeezed by the demands for professionalization and disciplinization, the need to become another regional area of study analogous in form, if not in content, to the other special sciences. And the humanities have been content to play this disciplinary game ever since.
In the last 30 years, the rise of Theory promised to breathe a new, post-modern life into this disciplinary game. By the mid-20th century, the sterility of old-fashioned explication de texte was becoming apparent. The linguistic turn opened up a new way for humanists to ape the rigor of the sciences while simultaneously extending their scholarly turf. In their zeal for technical rigor, they discovered to their delight that texts marvelously shift shape depending upon the theoretical language used in their analyses. Into the moribund body of the humanities flowed the European elixirs of psychoanalysis, phenomenology and hermeneutics, structuralism and post-structuralism, all of which boasted technical vocabularies that would make a quantum physicist blush. With these languages borrowed from other disciplines, the great books of the Western tradition looked fresh and sexy, and whole new fields of scholarship opened up overnight.
At the same moment, however, scholars of the humanities outside the graduate departments of elite universities suddenly found themselves under-serving their students. For the impulse that drives young people to the humanities is not essentially scholarly. The cult of expertise inevitably muffles the jazzy, beating heart of the humanities, and the students who come to the university to understand their great vibration return home unsatisfied. Or worse, they turn into scholars themselves, funneling what was an enormous intellectual curiosity through the pinhole of a respectable scholarly specialty.
Indeed, their good vibrations fade into a barely discernible note, a song they recall only with jaded irony, a sophisticated laugh at the naiveté of their former selves, as if to go to school to learn the meaning of their own lives were an embarrassing youthful enthusiasm. The triumph of irony among graduate students in the humanities, part of the déformation professionnelle characteristic of the Harvard virus, exposes just how far the humanities have fallen from their original state. As originally conceived, the humanities squirm within the research paradigm and disciplinary boxes at the heart of the Harvard model.
The term "humanities" predates the age of disciplinary knowledge. In the Renaissance, the studia humanitatis formed part of the attempt to reclaim classical learning, to serve the end of living a rich, cultivated life. Whether they were contemplative like Petrarch or engaged like Bruni, Renaissance humanists devoted themselves to the study of grammar, rhetoric, logic, history, literature, and moral philosophy, not simply as scholars, but as part of the project of becoming a more complete human being.
Today, however, the humanities remain entrenched in an outmoded disciplinary ideology, wedded to an academic model that makes it difficult to discharge this fundamental obligation to the human spirit. Despite the threat of the Great Recession, the rise of the for-profit university, and a renewed push for utility, the humanities continue to indulge their fetish of expertise and drive students away. Some advocate going digital, using the newest techno and cyber techniques to improve traditional scholarly tasks, like data-mining Shakespeare. Others turn to the latest discoveries in evolutionary psychology to rejuvenate the ancient texts. But both of these moves are inward-looking — humanists going out into the world, only to return to the dusty practices that have led the humanities to their current cul-de-sac. In so doing, colleges and universities across the country continue to follow the Harvard model: specialize, seek expertise, and turn inward.
When Descartes and Plotinus and Poe and St. John of the Cross created their works of genius, they were responding not to the scholar’s task of organizing and arranging, interpreting and evaluating the great works of the humanistic tradition, but rather to their own Kansas. Descartes and Rousseau were latter-day Kerouacs, wandering Europe in search of their souls. These men and women produced their works of genius through a vibrant, vibrating attunement to the needs of their time.
The Humanities! The very name should call up something wild. From the moment Socrates started wandering the Greek market and driving Athenian aristocrats to their wits’ end, their place has always been out in the world, making connections between the business of living and the higher reaches of one’s own thought, and drawing out implications from all that life has to offer. The genius of the humanities lies in the errant thought, the wild supposition, the provocation -- in Ginsberg’s howl at society. What this motley collection of disciplines is missing is an appreciation of the fact that the humanities have always been undisciplined, that they are essentially non-disciplinary in nature. And if we want to save them, they have to be de-disciplined and de-professionalized.
De-disciplining the humanities would transform both the classroom and the curriculum. Disengaging from the Harvard model would first and foremost help us question the assumption that a scholarly expert in a particular discipline is the person best suited to teaching the subject. The quality that makes a great scholar — the depth of learning in a particular, narrow field — does not make a great teacher; hungry students demand much more than knowledge. While the specialist is hemming himself in with qualifications and complications, the broadly educated generalist zeros in on the vital nub, the living heart of a subject that drives students to study.
While a scholarly specialist is lecturing on the ins and outs of Frost’s irony, the student sweats out his future, torn between embracing his parents’ dream of having a doctor in the family and taking the road less traveled to become a poet. The Harvard model puts great scholars in charge of classrooms that should be dominated by great teachers. And if the parents who are shelling out the price of a contemporary college education knew their dollars were funding such scholarly hobbyhorses, they would howl in protest.
De-disciplining the humanities would also fundamentally change the nature of graduate and undergraduate education. At the University of North Texas Department of Philosophy and Religious Studies, located in the Dallas Metroplex, we are training our graduate students to work with those outside their discipline — with scientists, engineers, and policy makers — to address some of the most pressing environmental problems the country faces. We call it field philosophy: taking philosophy out into the world to hammer out solutions to highly complex and pressing social, political, and economic problems. Graduate students participate in National Science Foundation grants and practice the delicate skill of integrating philosophic insights into public policy debates, often in a "just-in-time" manner. In class they learn how to frame and reframe their philosophical insights into a variety of rhetorical formats, for different social, political, economic purposes, audiences and time constraints.
At Calumet College of St. Joseph, an urban, Roman Catholic commuter college south of Chicago that serves underprepared, working-class Hispanic, African-American, and Anglo students, we are throwing the humanities into the fight for social justice. Here the humanities are taught with an eye toward creating not a new generation of scholars, but a generation of humanely educated citizens working to create a just society. At Calumet, students are required to take a social justice class.
In it they learn the historical and intellectual roots of Catholic social justice teaching while performing ten hours of community service learning. They work in a variety of social service settings -- with children, the elderly, the homeless -- which exposes them to the real-life, street-level experience of social challenges. Before, during, and after their service, students bring this experience back to the classroom to deepen it through reflective papers and class discussion.
High-level humanistic scholarship will always have a place within the academy. But to limit the humanities to the Harvard model, to make scholarship rather than, say, public policy or social justice, the highest ideal of humanistic study, is to betray the soul of the humanities. To study the humanities, our students must learn textual skills, the scholarly operations of reading texts closely, with some interpretive subtlety. But the humanities are much more than a language game played by academic careerists.
Ultimately, the self-cultivation at the heart of the humanities aims to develop the culture at large. Unless they end up where they began -- in the marketplace, alongside Socrates, questioning, goading, educating, and improving citizens -- the humanities have aborted their mission. Today, that mission means finding teachers who have resisted the siren call of specialization and training undergraduate and graduate students in the humanities in the art of politics.
The humanist possesses the broad intellectual training needed to contextualize social problems, bring knowledge to bear on social injustice, and translate insights across disciplines. In doing so, the humanist helps hold together an increasingly disparate and specialized society. The scholasticism of the contemporary academy is anathema to this higher calling of the humanities.
We are not all Harvard, nor should we want to be.
Chris Buczinsky is head of the English program at Calumet College of St. Joseph in Whiting, Indiana. Robert Frodeman is professor of philosophy and director of the Center for the Study of Interdisciplinarity at the University of North Texas.
Massive open online courses (MOOCs) have captured the nation’s imagination. The notion of online classes enrolling more than 100,000 students is staggering. Companies are springing up to sponsor MOOCs, growing numbers of universities are offering them, and the rest of America’s colleges are afraid they will be left behind if they don’t.
But MOOCs alone are unlikely to reshape American higher education. When history looks back on them, they may receive no more than a footnote. However, they mark a revolution in higher education that is already occurring and which will continue.
America is shifting from a national, analog, industrial economy to a global, digital, information economy. Our social institutions, colleges and universities included, were created for the former. Today they all seem to be broken. They work less well than they once did. Through either repair or replacement — more likely a combination — they need to be refitted for a new age.
Higher education underwent this kind of evolution in the past as the United States shifted from an agricultural to an industrial economy. The classical agrarian college, imported from 17th-century England with a curriculum rooted in the Middle Ages, was established to educate a learned clergy to govern the colonies. This model held sway until the early 19th century.
In the years before the Civil War, the gap between colleges and society grew larger. European higher education modernized, creating models that would inspire America to grow its own. Innovations, mostly small, were attempted; many failed. During and after the war, the scale of experimentation increased with the founding of universities such as Cornell University and Johns Hopkins University, and, a few decades later, the University of Chicago. Other institutions, such as Harvard University, remade themselves. The innovations spread. By the mid-20th century a new model of higher education for an industrial era had coalesced. It was codified in California’s 1960 master plan, balancing selectivity with access and workforce development.
This transition brought new institutions that better met the needs of an industrializing America.
An entity called the university was imported from Germany, with what would become a mission of teaching, research and service. It offered instruction in professions essential to an industrial society, organized knowledge into relevant specialties, and hired expert faculty in those areas. It not only transmitted the knowledge of the past, but advanced the frontiers of knowledge for the future.
The federal government created the land-grant college to bridge the old agrarian America and the emerging industrial one. Now found in all 50 states, the land-grant college was designed to provide instruction in agriculture and the mechanic arts without excluding classical studies.
Specialized institutions emerged. Some, like the Massachusetts Institute of Technology, were modeled on the European polytechnics; they promoted industrial science and technology and prepared leaders in these fields. Others, the normal schools, sought to provide more and better teachers as the evolving economy demanded more education of its citizenry.
The two-year college — originally called a junior college, later a community college, sometimes Democracy’s College — was initially established to offer lower-division undergraduate education in the local community.
As these institutions emerged, the curriculum changed. Graduate studies were introduced. New professional schools in fields like engineering, business and education became staples. Continuing education and correspondence courses were added. Elective courses and majors arose. Disputation, recitation, and memorization, the teaching methods of the agrarian college, gave way to lectures, seminars, and laboratories.
The colleges that persisted adopted many of the era’s changes, and the classical curriculum largely disappeared.
This is the history of higher education in America. Change has occurred by accretion. The new has been added to the old and the old, over time, modernized. Change occurs with no grand vision of the system that the future will require. New ideas are tried; some succeed, many fail. By successive approximations, what emerges is the higher education system necessary to serve the evolved society.
Social change is a constant, and so is the need for higher education to adapt to it. When the change in society is deleterious, as in the McCarthy era, it is the responsibility of higher education to resist it and right the society. It is a natural process, almost like a dance. However, in times of massive social change like the transformation of America to an information economy, a commensurate transformation on the part of higher education is required.
We are witnessing precisely that today. MOOCs, like the university itself or graduate education or technology institutes, are one element of the change. They may or may not persist or be recognizable in the future that unfolds.
What does seem probable is this. As in the industrial era, the primary changes in higher education are unlikely to occur from within. Some institutions will certainly transform themselves as Harvard did after the Civil War, but the boldest innovations are likelier to come from outside or from the periphery of existing higher education, unencumbered by the need to slough off current practice. They may be not-for-profits, for-profits or hybrids. Names like Western Governors University, Coursera, and Udacity leap to mind.
We are likely to see one or more new types of institution emerge. As each economic and technological revolution creates new needs for higher education, unique institutions emerge to meet them. In the agrarian era, only a tiny percentage of the population needed higher education, and the college served these elite few. When industrial America required more education, more research, and mass access to college, two major institutions were established: the university and the community college.
The information economy, which requires a more educated population than ever before in history, will seek universal postsecondary education and is likely to create new institutions to establish college access for all at low cost. These institutions will operate globally, not locally, which will dictate a digital format. Because information economies emphasize time-variable, common outcomes — unlike the industrial era’s common processes and fixed times (think assembly lines) — universal-access institutions will offer individualized, time-variable instruction, rooted in mastery of explicit learning outcomes. Degrees and credits are likely to give way to competency certification and badges.
Traditional higher education institutions — universities and colleges — will continue, evolving as did their colonial predecessors. Their numbers will likely decline. At greatest risk will be regional, part-time commuter universities and less-selective, low-endowment private colleges, particularly in New England, the Mid-Atlantic, and the Midwest. The future of the community college and its relationship to the universal-access university is a question mark. It is possible that sprawling campuses will shed real estate in favor of more online programs, more compact learning centers and closer connections with employers and other higher education units.
In this era of change, traditional higher education — often criticized for being low in productivity, being high in cost, and making limited use of technology — will be under enormous pressure to change.
Policy makers and investors are among those forces outside of education bringing that pressure to bear. It’s time for higher education to be equally aware and responsive.
Arthur Levine, a former president of Teachers College, Columbia University, is president of the Woodrow Wilson National Fellowship Foundation.
Pearson VUE, which operates a worldwide network of testing centers for various exams, has been experiencing significant technical problems this week. The company's Facebook page features numerous comments from people unable to take their scheduled exams or to get information about when they will be able to do so. Some people are posting stories of how hours-long delays likely affected their performance on exams that are crucial to their careers. On the Facebook page, Pearson indicates that it is aware of the problems and is trying to fix them.
"We are continuing our efforts to restore normal service as quickly as possible. We are in the midst of implementing recommendations by our internal and external technology experts, but it is too soon to know how quickly this will improve system performance. Please note that there will likely be additional variations in system performance as we implement these changes," says a statement posted Thursday evening. "We fully appreciate that many of you have been significantly impacted by the circumstances over the past several days, and we will increase testing capacity and operational support to accommodate scheduling and/or rescheduling of those affected as quickly as possible once normal system performance is restored."
While many educators and politicians say that colleges need to increase science and technology enrollments to meet workforce demands, a study being released today suggests that there is no shortage of STEM workers. The study -- by the Economic Policy Institute, a nonpartisan but liberal-leaning think tank -- finds that:
Students have already responded to the interest in STEM by majoring in science and technology fields in sufficient numbers to meet workforce demands.
Only one of every two STEM graduates finds a job in a related field.
In computer and information science and in engineering, colleges in the United States are graduating 50 percent more students each year than there are jobs in those fields.
Of computer science graduates who do not enter the IT workforce, 32 percent say it is because they could not find an IT job, and 53 percent say they found better jobs outside of IT.
My first encounter with assessment came in the form of a joke. The seminary where I did my Ph.D. was preparing for a visit from the Association of Theological Schools, and the dean remarked that he was looking forward to developing ways to quantify all the students' spiritual growth. By the time I sat down for my first meeting on assessment as a full-time faculty member in the humanities at a small liberal arts college, I had stopped laughing. Even if we were not setting out to grade someone’s closeness to God on a scale from 1 to 10, the detailed list of "learning outcomes" made it seem like we were expected to do something close. Could education in the liberal arts — and particularly in the humanities — really be reduced to a series of measurable outputs?
Since that initial reaction of shock, I have come to hold a different view of assessment. I am suspicious of the broader education reform movement of which it forms a part, but at a certain point I asked myself what my response would be if I had never heard of No Child Left Behind or Arne Duncan. Would I really object if someone suggested that my institution might want to clarify its goals, gather information about how it’s doing in meeting those goals, and change its practices if they are not working? I doubt that I would: in a certain sense it’s what every institution should be doing. Doing so systematically does bear significant costs in terms of time and energy — but then so does plugging away at something that’s not working. Investing a reasonable number of hours up front in data collection seems like a sensible hedge against wasting time on efforts or approaches that don’t contribute to our mission. By the same token, getting into the habit of explaining why we’re doing what we’re doing can help us to avoid making decisions based on institutional inertia.
My deeper concerns come from the pressure to adopt numerical measurements. I share the skepticism of many of my colleagues that numbers can really capture what we do as educators in the humanities and at liberal arts colleges. I would note, however, that there is much less skepticism that numerical assessment can capture what our students are achieving — at least when that numerical assessment is translated into the alphabetical form of grades. In fact, some have argued that grades are already outcome assessment, rendering further measures redundant.
I believe the argument for viewing grades as a form of outcome assessment is flawed in two ways. First, I simply do not think it’s true that student grades factor significantly in professors’ self-assessment of how their courses are working. Professors who give systematically lower grades often believe that they are holding students to a higher standard, while professors who grade on a curve are simply ranking students relative to one another. Further, I imagine that no one would be comfortable with the assumption that the department that awarded the best grades was providing the best education — many of us would likely suspect just the opposite.
Second, it is widely acknowledged that faculty as a whole have wavered in their dedication to strict grading, due in large part to the increasingly disproportionate real-world consequences grades can have on their students’ lives. The "grade inflation" trend seems to have begun because professors were unwilling to condemn a student to die in Vietnam because his term paper was too short, and the financial consequences of grades in the era of ballooning student loan debt likely play a similar role today. Hence it makes sense to come up with a parallel internal system of measurement so that we can be more objective.
Another frequently raised concern about outcome assessment is that the pressure to use measures that can easily be compared across institutions could lead to homogenization. This suspicion is amplified by the fact that many (including myself) view the assessment movement as part of the broader neoliberal project of creating “markets” for public goods rather than directly providing them. A key example here is Obamacare: instead of directly providing health insurance to all citizens (as nearly all other developed nations do), the goal was to create a more competitive market in an area where market forces have not previously been effective in controlling costs.
There is much that is troubling about viewing higher education as a competitive market. I for one believe it should be regarded as a public good and funded directly by the state. The reality, however, is that higher education is already a competitive market. Even leaving aside the declining public support for state institutions, private colleges and universities have always played an important role in American higher education. Further, this competitive market is already based on a measure that can easily be compared across institutions: price.
Education is currently a perverse market where everyone is in a competition to charge more, because that is the only way to signal quality in the absence of any other reliable measure of quality. There are other, more detailed measures such as those collected by the widely derided U.S. News & World Report ranking system — but those standards have no direct connection to pedagogical effectiveness and are in any case extremely easy to game.
The attempt to create a competitive market based on pedagogical effectiveness may prove unsuccessful, but in principle, it seems preferable to the current tuition arms race. Further, while there are variations among accrediting bodies, most are encouraging their member institutions to create assessment programs that reflect their own unique goals and institutional ethos. In other words, for now the question is not whether we’re measuring up to some arbitrary standard, but whether institutions can make the case that they are delivering on what they promise.
Hence it seems possible to come up with an assessment system that would actually be helpful for figuring out how to be faithful to each school or department’s own goals. I have to admit that part of my sanguine attitude stems from the fact that the pedagogy at my institution, Shimer College, embodies what independent researchers have already demonstrated to be “best practices” in terms of discussion-centered, small classes — and so if we take the trouble to come up with a plausible way to measure what the program is doing for our students, I’m confident the results will be very strong. Despite that overall optimism, however, I’m also sure that there are some things we’re doing that aren’t working as well as they could, though currently we have no way of really knowing which. We all have limited energy and time, and so anything that can help us make sure we’re devoting our energy to things that are actually beneficial seems all to the good.
Further, it seems to me that strong faculty involvement in assessment can help to protect us from the whims of administrators who, in their passion for running schools "like a business," make arbitrary decisions based on their own perception of what is most effective or useful. I have faith that the humanities programs that are normally targeted in such efforts can easily make the case for their pedagogical value, just as I am confident that small liberal arts schools like Shimer can make a persuasive argument for the value of their approach. For all our justified suspicions of the agenda behind the assessment movement, none of us in the humanities or at liberal arts colleges can afford to unilaterally disarm and insist that everyone recognize our self-evident worth. If we believe in what we’re doing, we should welcome the opportunity to present our case.
Adam Kotsko is assistant professor of humanities at Shimer College.