Essay on what to do, as an instructor, when you miss a class

Instant Mentor

Rob Weir considers how to handle it, as an instructor, when you can't be there.


Essay on the impact of adaptive and competency-based learning on traditional-age students

Several decades ago — long before the level of technological sophistication we experience today — I was part of a movement begun by the late Julian Stanley, a psychology professor, and the Johns Hopkins University Center for Talented Youth (CTY) to save academically talented youth from boredom in the schools. The most controversial instrument to rescue them was a pedagogical practice called, rather prosaically, "Diagnostic Testing Followed by Prescriptive Instruction" or, in shorthand, "DT>PI." It was principally applied to the pre-collegiate mathematics curriculum and relied on just a few key assumptions and practices:

1. Students already know something about a subject before they formally study it.

2. Test students before a course begins and then just instruct them on what they don’t know.

3. Test students again when you, as the instructor, and they, as learners, believe they have achieved competency in a subject.

4. Move immediately to the next level of instruction.

The DT>PI model was placed in a more generous context with the adaptation of Professor Halbert Robinson's (University of Washington) notion of the Optimal Match. Simply stated, the pace and level of instruction should optimally match an individual student's assessed abilities — with the caveat that those assessing that talent would always try to stretch a student beyond his comfort zone. The Optimal Match theoretically could apply to all students at any level of education.

When I used to speak publicly in a wide variety of settings — at colleges and universities, community colleges, schools, education association meetings, parent gatherings — about what I thought to be the commonsensical notions of DT>PI and the Optimal Match, the reactions were pronounced and fiercely negative. My colleagues and I were accused of presenting educators with the dissolution of the structured classroom as we knew it then; forcing students unjustifiably to proceed educationally without sufficient instructional guidance; destroying the communal, cooperative imperative of an American education; and producing social misfits, because students would finish academic coursework before the schedule established (rather arbitrarily, I might add) by educational professionals for all students of one age at one time. Parents often joined with educators to decry such imagined alienation and damage to a child's personality.

And then there was a change in 2013.

There are now two closely related pedagogies -- adaptive learning and competency-based learning -- that are embraced by a growing number in higher education as a viable component of educational reform. The Bill and Melinda Gates Foundation is awarding grants to 18 institutions to experiment with 10 different adaptive learning platforms, and President Obama has expressed support for these innovations and urged the easing of regulations to make them possible.

In general, adaptive learning uses data-driven information to design coursework that permits students to proceed educationally at their appropriate pace and level. And competency-based learning allows students to be free of "seat time" and to progress flexibly as they demonstrate mastery of academic content.

These definitions, when combined, delineate precisely the key components of DT>PI and those of numerous other experiments in self-paced learning over the last few decades. But now, while the naysayers are still out there, an increasing number of for-profits and nonprofits are turning to adaptive and competency-based learning as a component of the next stage of reform in American education.

Why now? Something must have changed in society to accept self-paced, individualized learning when only decades ago it was roundly rejected on pedagogical, ethical and psychological grounds. Those objections clearly do not carry the force they did only years ago. Answering this question might well provide education reformers with insight into what is now possible — even expected — from students on the learning platforms of the future.

There are at least three reasons why self-paced learning might be more popular now than it was only a few years ago: technological advances, financial exigency and a new self-profile of the learner.

Advances in technology that rely on data mining and data analytics — predicting an individual's future learning behavior based upon analysis of thousands of earlier learners — now make it possible to track, direct, customize and evaluate student learning, and to advise students, at nearly instantaneous speeds. What in previous decades seemed an impossible task for a teacher or professor to manage in a single course — diverse learning points among students — is at least now technically feasible.

Many institutions are intent on finding new strategies that will reduce their cost of providing an education. Adaptive and competency-based learning are thought to be such "disruptive" opportunities, although how the accompanying data-driven, all-knowing, anticipating, high-touch technologies will dramatically reduce both cost and price (tuition) remains unclear.

And, lastly, students have perhaps finally realized the expectation of the self-esteem movement that has dominated instruction in our nation's schools for several decades. Students might well now believe that they are the center of all activity — including education — and that they are both the sole focus and the drivers of learning. All instructional effort exists for the purpose of fulfilling their desires.

This "power shift" makes learners, individually — not teachers or professors -- aggregators of knowledge by and for themselves. Any approach to education that places them at the center of learning activity accommodates their perspective on education. Adaptive and competency-based learning accomplish this masterfully. Self-paced, individually adjusted instruction, enhanced by “big data” technologies that guide student progress “lockstep” in a course and beyond, eliminates distracting elements to the individual control of knowledge. Primary among those distractions for students are faculty with their pesky, seemingly inefficient and irrelevant questions.

And thus, in 2013, what was not acceptable several decades ago is now thought a solution to the crisis in American education. A combination of new technologies, financial emergency and a shift in who is at the center and in control of learning has caused this to occur.

But all is not settled. The changing circumstances introduce concerns that did not exist decades ago, when students were not the arbiters of their own learning, when self-paced instruction was thought to be a solution not for all students in American education but only for the academically talented, and when big data did not exist to mine and anticipate every move in student learning.

A defining element of DT>PI was that students must not just study what is the next logical step in a course; they must, through the exhortations of a teacher or professor, attempt to go beyond what was thought statistically possible — they must stretch themselves intellectually at every point. Professor Stanley constantly quoted the poet Robert Browning's line that one's reach must always exceed one's grasp.

Questions remain whether, in the absence of a live instructor's exhortation, students who are not necessarily academically acute and motivated will extend their reach or settle for statistically generated achievement delivered by an electronic adviser. (I am referring here to traditional-age undergraduates, not working adults who are propelled by substantial motivational factors.) Such an absence of exhortation could be extremely damaging to the majority of American students, who often do not naturally attempt to achieve at the levels of which they are capable without personal mentorship.

And one traces in those who are enthralled with "big data" and "data analytics" for solving the maladies of American education a disturbing belief: students will achieve, through data-enhanced technologies, the perfectibility of education — perhaps of life itself — by eliminating all resistance, frustration, indecision, trial and error, chance and expenditure of time. For example, Harvard University social scientist and university professor Gary King is quoted in a May 20, 2013, New Yorker article entitled "Laptop U" as saying, "With enough data over a long period, you could crunch inputs and probabilities and tell students, with a high degree of accuracy, exactly which choices and turns to make to get where they wanted to go in life."

And yet there is growing commentary that it is precisely the absence of frustration, resistance and associated imperfections in a so-called "Me Generation" and its aftermath that is compromising contemporary students' learning and preparation for life. By blithely accepting students' assertion of self-determination without legitimate maturing experiences (which will include failure and self-doubt), and by arranging learning electronically so that students make no wrong decisions, educators grant them little ability to deal with the inevitable disappointment and frustration of life.

Students are educated without gaining resilience, and that is hardly an education of which a nation can be proud, or in which it can feel secure, regardless of the utopian promises of the big data enthusiasts. All this reminds me of a call I received decades ago from an entrepreneur who wanted me to comment on his idea of developing a school basketball court whose hoop would move electronically with the ball so that no student would ever miss a shot and thus, in his words, "suffer humiliation."

So while I am delighted that self-paced education in the form of adaptive and competency-based learning is finally a more generally discussed component of reform in American education, I urge those advancing it to think long and hard about some of the humanly damaging consequences of learning platforms so perfected by technology that students are offered a Faustian bargain: the comfort of resistance-free and frustration-free learning in exchange for the ultimate loss of the resilience needed for a satisfying life after schooling.


William G. Durden is president emeritus and professor of liberal arts at Dickinson College, and operating partner at Sterling Partners, a private equity company.


Essay on the real death of the humanities

In all those years I was pursuing a Ph.D. in religious studies, the question of what my profession really stood for rarely came up in conversation with fellow academics, save for occasional moments when the position of the humanities in higher education came under criticism in public discourse. When such moments passed, it was again simply assumed that anyone entering a doctoral program in the humanities knowingly signed on to a traditional career of specialized research and teaching.

But the closer I got to receiving that doctorate, the less certain I became that this was a meaningful goal. I was surrounded by undergraduates who were rich, well-meaning, and largely apathetic to what I learned and taught. I saw my teachers and peers struggle against the tide of general indifference aimed at our discipline and succumb to unhappiness or cynicism. It was heartbreaking.

Fearing that I no longer knew why I studied religion or the humanities at large, I left sunny California for a teaching job at the Asian University for Women, in Chittagong, Bangladesh. My new students came from 12 different countries, and many of them had been brought up in deeply religious households, representing nearly all traditions practiced throughout Asia. They, however, knew about religion only what they had heard from priests, monks, or imams, and did not understand what it meant to study religion from an academic point of view. And that so many of them came from disadvantaged backgrounds convinced me that this position would give me a sense of purpose.

I arrived in Bangladesh prepared to teach an introductory course on the history of Asian religions. But what was meant to be a straightforward comparison of religious traditions around the region quickly slipped from my control and morphed into a terrible mess. I remember an early lesson: When I suggested during a class on religious pilgrimage that a visit to a Muslim saint’s shrine had the potential to constitute worship, it incited a near-riot.

Several Muslim students immediately protested that I was suggesting heresy, citing a Quranic injunction that only Allah should be revered. What I had intended was to point out how similar tension existed in Buddhism over circumambulation of a stupa — an earthen mound containing the relics of an eminent religious figure — since that act could be seen as both remembrance of the deceased’s worthy deeds and veneration of the person. But instead of provoking a thoughtful discussion, my idea of comparative religious studies seemed only to strike students as blasphemous.

Even more memorable, and comical in hindsight, was being urged by the same Muslims in my class to choose one version of Islam among all its sectarian and national variations and declare it the best. Whereas Palestinians pointed to the "bad Arabic" used in the signage of one local site as evidence of Islam's degeneration in South Asia, a Pakistani would present Afghans as misguided believers because — she claimed — they probably never read the entire Quran. While Bangladeshis counseled me to ignore Pakistanis from the minority Ismaili sect, who claim that God is accessible through all religions, Bangladeshis themselves were ridiculed by other students for not knowing whether they were Sunni or Shi'a, the two main branches of Islam. In the midst of all this, my call to accept these various manifestations of Islam as intriguing theological propositions went unheeded.

With my early enthusiasm and amusement depleted, I was ready to declare neutral instruction of religion in Bangladesh impossible. But over the course of the semester I could discern one positive effect of our classroom exercise: students' increasing skepticism toward received wisdom. In becoming comfortable with challenging my explanations and debating competing religious ideas, students came to perceive any view of religion as more an argument than an indisputable fact. They no longer accepted a truth claim at face value; instead, they analyzed its underlying logic in order to evaluate the merit of the argument. They expressed confidence in the notion that a religion could be understood in multiple ways. And all the more remarkable was their implicit decision over time to position themselves as rational thinkers and to define their religions for themselves.

An illustrative encounter took place at the shrine of the city’s most prominent Muslim saint. I, being a man, was the only one among our group to be allowed into the space. My students, the keeper of the door said, could be "impure" — menstruating — and were forbidden to enter. Instead of backing down as the local custom expected, the students ganged up on the sole guard and began a lengthy exposition on the meaning of female impurity in Islam. First they argued that a woman was impure only when she was menstruating and not at other times; they then invoked Allah as the sole witness to their cyclical impurity, a fact the guard could not be privy to and thus should not be able to use against them; and finally they made the case that if other Muslim countries left it up to individual women to decide whether to visit a mosque, it was not up to a Bangladeshi guard to create a different rule concerning entry. Besieged by a half-dozen self-styled female theologians of Islam, the man cowered, and withdrew his ban.

I was incredibly, indescribably proud of them.

Equally poignant was coming face to face with a student who asked me to interpret the will of Allah. Emanating the kind of glow only the truly faithful seem to possess, she sat herself down in my office, fixed the hijab around her round alabaster face, and quietly but measuredly confessed her crime: She had taken to praying at a Hindu temple because most local mosques did not have space for women, and she was both puzzled and elated that even in a non-Islamic space she could still sense the same divine presence she had been familiar with all her life as Allah. She asked for my guidance in resolving her crisis of faith. If other Muslims knew about her routine excursions to a Hindu temple, she would be branded an apostate, but did I think that her instinct was right, and that perhaps it was possible for Allah to communicate his existence through a temple belonging to another religion?

In the privacy of my office, I felt honored by her question. I had lectured on that very topic just before this meeting, arguing that sacred space was not the monopoly of any one religion, but could be seen as a construct contingent upon the presence of several key characteristics. This simple idea, which scholars often take for granted, had struck her as a novel but convincing explanation for her visceral experience of the Islamic divine inside a Hindu holy space. Though she had come asking for my approval of her newly found conviction, it was clear that she did not need anyone’s blessing to claim redemption. Humanistic learning had already provided her with a framework under which her religious experience could be made meaningful and righteous, regardless of what others might say.

And thanks to her and other students, I could at last define my own discipline with confidence I had until then lacked: The humanities is not just about disseminating facts or teaching interpretive skills or making a living; it is about taking a very public stance that above the specifics of widely divergent human ideas exist more important, universally applicable ideals of truth and freedom. In acknowledging this I was supremely grateful for the rare privilege I enjoyed as a teacher, having heard friends and colleagues elsewhere bemoan the difficulty of finding a meaningful career as humanists in a world constantly questioning the value of our discipline. I was humbled to be able to see, by moving to Bangladesh, that humanistic learning was not as dispensable as many charge.

But before I could fully savor the discovery that what I did actually mattered, my faith in the humanities was again put to the test when a major scandal befell my institution. After all my posturing before students about the importance of seeking truth, I knew that as a member of this community I had to critique what was happening. If I remained silent, it would amount to a betrayal of my students and a discredit to my recent conclusion that humanistic endeavor is meant to make us not only better thinkers, but also more empowered and virtuous human beings.

So it was all the more crushing to be told to say nothing by the people in my very profession, whose purpose I thought I had finally ascertained. In private chats my friends and mentors in academe saw only the urgent need for me to extricate myself for the sake of my career, but had little to say about how to address the situation. Several of my colleagues on the faculty, though wonderful as individuals, demurred from taking a stance for fear of being targeted by the administration for retribution or losing the professional and financial benefits they enjoyed. And the worst blow, more so than the scandal itself, was consulting the one man I respected more than anybody else, a brilliant tenured scholar who chairs his own department at a research university in North America, and receiving this one-liner:

"My advice would be to leave it alone."

It was simultaneously flummoxing and devastating to hear a humanist say that when called to think about the real-life implications of our discipline, we should resort to inaction. And soon it enraged me that the same people who decry the dismantling of traditional academe under market pressure and changing attitudes toward higher education could be so indifferent, thereby silently but surely contributing to the collapse of humanists’ already tenuous legitimacy as public intellectuals.

While my kind did nothing of consequence, it was the students — the same students whom I had once dismissed as incapable of intellectual growth — who tried to speak up at the risk of jeopardizing the only educational opportunity they had. They approached the governing boards, the administration, and the faculty to hold an official dialogue. They considered staging a street protest. And finally, they gave up and succumbed to cynicism about higher education and the world, seeing many of their professors do nothing to live by the principles taught in class, and recognizing the humanities as exquisitely crafted words utterly devoid of substance.

As my feelings about my discipline shifted from profound grief to ecstatic revelation to acute disappointment, I was able to recall a sentiment expressed by one of my professors, who himself might not remember it after all these years. Once upon a time we sat sipping espresso on a verdant lawn not far from the main library, and he mused that he never understood why young people no longer seemed to feel outrage at the sight of injustice. He is a product of a generation that once rampaged through campuses and braved oppression by the Man. On first hearing his indictment, I was embarrassed to have failed the moral standard established by the older generation of scholars like him. But now I see that it is not just young people but much of our discipline, both young and old, that at present suffers from moral inertia. With only a few exceptions, the humanists I know do not consider the enactment of virtue to be their primary professional objective, whether because of the more important business of knowledge production or the material exigencies of life. And I can only conclude, with no small amount of sadness, that most humanists are not, nor do they care to be, exemplary human beings.

Maybe I should move on, as did a friend and former academic who believes that the only people we can trust to stand on principle are "holy men, artists, poets, and hobos," because yes, it is true that humanists should not be confused with saints. But the humanities will always appear irrelevant as long as its practitioners refrain from demonstrating a tangible link between what they preach and how they behave. In light of the current academic penchant for blaming others for undoing the humanities, it must be said that humanists as a collective should look at themselves first, and feel shame that there is so much they can say — need to say — about the world, but that they say so little at their own expense.

After a year and a half in Bangladesh, I do not doubt any longer that the humanities matters, but now I know that the discipline’s raison d’être dies at the hands of those humanists who do not deserve their name.

Se-Woong Koo earned his Ph.D. in religious studies from Stanford University in 2011. He currently serves as Rice Family Foundation Visiting Fellow and Lecturer at Yale University.

More professors using social media as teaching tools


The majority of faculty members are not using social media in the classroom, a new study finds, but the proportion of professors using social media is increasing. 

The t-shirt many professors would enjoy wearing


Each year, David Lydic's students ask him the same questions. So sometimes he has to just let his shirt do the talking.

Class-sourcing as a teaching strategy (essay)

Having your students work in groups to produce publicly accessible digital artifacts helps them learn and helps instructors stay relevant — and Gleb Tsipursky shows you how to do it.


U. of Kentucky hopes to boost student retention with prescriptive analytics


At the University of Kentucky, a data-driven approach to student retention involves asking students about their tablet use and sleeping patterns.

Colleges start academic programs

  • Daytona State College is starting a bachelor of science in nursing.
  • Rollins College is starting two new undergraduate majors: business with a management concentration, and social entrepreneurship and business.

American adults see online courses as at least equivalent in most ways


Gallup survey finds majority of adults see online courses as equal to or better than classroom-based courses in several key ways.

Essay on reforms needed to promote success at community colleges

The lines for advisers begin to form early in the morning in late summer and early fall at my community college. It is August, six days from the start of classes, and we will likely admit and enroll 35 percent of our new fall students in the next week. These students will need orientation and advising and help with financial aid, and so will flood into our student center by the hundreds, facing long wait times and frazzled staff.

In another building, on the other side of campus, the academic deans are deciding which low-enrollment classes will be cut in a few days. They are waiting until the last possible minute to let the final third of our incoming class register, which means that some adjuncts will find out they are unemployed a day, maybe two, before they were scheduled to teach.

Many of our instructors will plan to wait to start covering course content until the second or third class session, knowing that significant numbers of students won't get registered until five or more days after the semester begins. Since we don't have mandatory placement and our online registration system doesn't enforce prerequisites, some instructors will, instead of beginning to cover content, spend the first few sessions trying to convince underprepared students to drop their class and take the developmental or introductory course they are actually ready for.

Some of our students will sign up for classes but will not have books for a month while they wait for financial aid to process. Some of our students will sign up for classes the day before the semester starts and will miss the first week entirely as they work to find childcare or adjust their work schedules or figure out the bus schedule to get to school.

This is the time of year when, as an administrator of a community college that is committed to providing access, especially to underserved populations, I can’t help but wonder if we are doing more harm than good. When we have taken the charge to provide access to mean that we shouldn’t have any real restrictions on how a student begins their college career, are we really providing opportunity or are we setting our most vulnerable students up to fail?

In the name of access, we and our community college peers across the country (I know that we are not unique in this discussion) have no deadlines for application or financial aid. We make students take assessment tests but then allow them to self-select into whatever classes they wish to take. We let brand-new students, many of whom are first-generation and in need of academic remediation, sign up for classes that have already met two or three times.

We worry over our rising student loan default numbers. We struggle to improve our retention and completion rates, and yet we have created a system that makes it O.K. for college to be a last-minute decision, where our most at-risk students start out behind and many never catch up. We force our professors to take students who will be seriously behind on their first day in class, and who will either sidetrack the instructor or fall further behind. Instructors, especially in our core classes, must balance meeting the course objectives with providing in-class remediation for underprepared students.

Our internal data show that there is a strong correlation between late enrollment and academic failure. The vast majority of our students who come to us in late August will be gone well before the end of the semester, many having student loans that will eventually become delinquent. And yet we continue to have practices that are not in the best interest of either the student or the institution.

I propose that it is time to change how we think about access at community colleges. It is time for:

  • Application and enrollment deadlines that ensure a student has enough time to get financial aid and payment plans in place before the semester begins. We need deadlines so a student knows that being successful requires planning and some time getting his or her life organized to be a student. A student who misses the deadline for enrollment isn't told "no"; he or she is told "next semester."
  • Mandatory orientation for all new students. We have a moral obligation to ensure that students have been informed of the institution's expectations, policies and practices before they try to begin navigating our increasingly large bureaucracies.
  • Required placement and advising prior to the first semester of enrollment. Students should start out knowing what they'll need to graduate, which classes they are truly ready for and what their academic plan will be.

Ultimately, it is time for bold leadership that is willing to begin to reframe what access should mean and is willing to put in place policies that might result in some initial enrollment declines in the hopes of better-prepared students in the long term.

In the next week at my college, literally a thousand students who are, according to our own data, unlikely to succeed will see an adviser, and I can't help but think we are at least partially responsible for their failure. Something must change.

The author requested anonymity because her bosses at her community college strongly disagree with these ideas, and she doesn't have tenure.


