
Pulse podcast features interview with Brad Koch of Blackboard Learn

This month's edition of The Pulse podcast features an interview with Brad Koch, vice president for product development at Blackboard Learn.

Essay on the real meaning of the debate over an early notification system for students

Signals has had a rough few months. Blog posts, articles, and pointed posts on social media have recently taken both the creators and promoters of the tool to task for inflated retention claims and for falling into common statistical traps in making those claims. Some of the wounds are self-inflicted — not responding is rarely received well. Others, however, are misunderstandings of the founding goals and the exciting next phases of related work.

Signals is a technology application originally created by a team at Purdue University that takes a basic rules-based set of predictions and triggers — based on years of educational research and insight from the university's faculty — and combines them with the real-time activity of students in a given course. It then presents students with a “traffic light” interface that sends them a familiar kind of message:

  • Green light: you’re doing well and on the right track.
  • Yellow light: you’re a little off track; you might want to think about x, y, or z, or talk with someone.
  • Red light: you’re in trouble. You probably need to reach out to someone for help.
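
To make the rules-based design concrete, here is a minimal sketch of the kind of trigger logic such a system might use. The feature names and thresholds are hypothetical stand-ins for illustration, not the actual rules built at Purdue:

```python
# Illustrative only: a toy rules-based "traffic light" check in the spirit of
# Signals. Feature names and thresholds are hypothetical, not Purdue's rules.

def traffic_light(percent_points_earned: float,
                  days_since_last_login: int,
                  assignments_missed: int) -> str:
    """Return 'green', 'yellow', or 'red' for a student in a given course."""
    if (percent_points_earned >= 80
            and days_since_last_login <= 3
            and assignments_missed == 0):
        return "green"   # on track: keep doing what you're doing
    if percent_points_earned < 60 or assignments_missed >= 3:
        return "red"     # in trouble: reach out to someone for help
    return "yellow"      # a little off track: adjust or talk with someone


# Example: a 72 percent average and a week without logging in yields "yellow".
print(traffic_light(percent_points_earned=72,
                    days_since_last_login=7,
                    assignments_missed=1))
```

The design choice worth noting in even a toy version like this is that the rules are fixed and course-agnostic, which is exactly the limitation the rest of this essay takes up.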

These same data are also shared with faculty and administrators so they can reach out to those who need specific support in overcoming an academic challenge or just extra encouragement. Signals is now part of the services offered by Ellucian, but just about all the major players in education technology offer some version of "early warning" applications. In our insight and analytics work, a number of colleges and universities are piloting related apps; however, the promise and problems of Signals are an important predicate as we move forward with that work.

The Signals app project began with a clear and compelling goal: to give students, faculty, and advisers access to data that might help them navigate the learning journey. For too long, the key data work in education has focused on reporting, accreditation, or research that produces long reports that few people see and that are all too often used to make excuses, brag, blame, or shame. More problematic, most of these uses happen after courses are over or, worse, after students have already dropped out.

The Signals team was trying to turn that on its head by gathering the kind of feedback data that research suggests may help students navigate a given course, and by giving more information to the faculty and administrators dedicated to helping them in the process. The course-level outcomes were strong. More students earned As, fewer got Fs, and the qualitative comments made it clear that many students appreciated the feedback. A welcome “wake-up call,” many called it.

John Campbell, then the associate vice president of information technology at Purdue, was committed to this vision. In numerous presentations he argued that “Signals was an attempt to take what decades of educational research was saying was important — tighter feedback loops — and create a clean, simple way for students, faculty, and advisers to get feedback that would be useful.”

Signals was a vital, high-profile first step in the process of turning the power of educational data work toward getting clean, clear, and usable information to the front lines. It was a pioneer in this work and should be recognized as such. The trouble is the conflation of this work with large-scale retention and student success efforts. Claiming a 21 percent long-term retention lift, as some at Purdue have, is a significant stretch at best. However, Signals has shown itself to be a useful tool for helping students navigate specific courses, and for the faculty and staff striving to support them. And while that will likely help long-term retention, there is still much work to be done both to bring Signals to the next level of utility in courses and to test its impact on larger student success initiatives.

First, as Campbell, now CIO at West Virginia University, notes, Signals has to truly leverage analytics. In our recent conversation he posited, “The only way to bring apps like Signals to their full potential, to bring them to scale, to make them sustainable is through analytics.” Front-line tools like Signals have to be powered by analyses that bring better and more personalized insight into individual students, based on large-scale, consistently updated, student-level predictive models of pathways through a given institution. Put simply, basing the triggers and tools of these front-line systems on blunt, best-practice rules is not personalized but generalized. It’s probably useful, but not optimal for that individual student. There needs to be a “next generation” of Signals, as Campbell notes, one that is more sophisticated and personalized.
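
As a rough illustration of that difference, the sketch below swaps a fixed, one-size-fits-all threshold for a student-level model trained on historical outcomes. The features, the tiny training set, and the use of scikit-learn are all assumptions made for the example; this is not Purdue's, Civitas Learning's, or any vendor's actual model.

```python
# A minimal sketch of a personalized risk score, assuming hypothetical features
# and a toy historical data set. Illustrative only, not any vendor's model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical history: one row per past enrollment, with prior GPA,
# credit load, logins per week, and percent of points earned so far.
X_history = np.array([
    [3.6, 12, 9, 0.88],
    [2.1, 15, 2, 0.54],
    [3.0, 18, 5, 0.71],
    [1.8, 12, 1, 0.40],
])
y_history = np.array([1, 0, 1, 0])  # 1 = completed the course successfully

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_history, y_history)

# Score a current student: the trigger fires on this student's own predicted
# probability, not on a blanket rule like "logged in fewer than N times."
current_student = np.array([[2.8, 16, 3, 0.62]])
p_success = model.predict_proba(current_student)[0, 1]
if p_success < 0.5:  # institution-chosen cut point
    print(f"Flag for outreach (predicted success probability {p_success:.2f})")
```

In practice such a model would be retrained regularly on institution-wide data, and the cut point would itself be tuned against observed outcomes rather than picked by hand.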

For example, with a better understanding of the entire student pathway derived from analytics anchored on individual-level course completion, retention, and graduation predictions, a student who was highly likely to struggle in a given course from day one — e.g., a student having consistent difficulty with writing-intensive courses who is trying to take three simultaneously — might be advised away from a combination of courses that could be toxic for him or her. By better balancing the course selection, the problem — which would not necessarily be the challenge of a given course — could be solved before it begins. In addition, an institution may find that for a cluster of students standard “triggers” for intervention are meaningless. We’ve seen institutions that are serving military officers who have stellar completion and grade patterns over multiple semesters; however, because of the challenges of their day jobs, regular attendance patterns are not the norm. A generalized predictive model that pings instructors, advisers, or automated systems to intervene with these students may be simply annoying a highly capable student and/or wasting the time of faculty and advisers who are pushed to intervene.

Second, these tools have to be studied and tuned to better understand and maximize their positive impact on diverse student populations. With large-scale predictive flow models of student progression and propensity-score matching, for example, we can better understand how these tools contribute to long-term student success. Moreover, we can do tighter testing on the impact of user-interface design.
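
For readers who want a concrete picture of what such a study might look like, here is a minimal sketch of propensity-score matching on simulated data. The column names, the simulated outcome, and the nearest-neighbor matching rule are all hypothetical, and a real evaluation would need to check covariate balance and estimate uncertainty far more carefully.

```python
# A minimal sketch of propensity-score matching for studying a tool's impact,
# using simulated data with hypothetical columns. Illustrative only.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Simulated data: whether a student used the tool, some covariates,
# and a retention outcome.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "prior_gpa": rng.normal(3.0, 0.5, n),
    "credit_load": rng.integers(6, 19, n),
    "pell_eligible": rng.integers(0, 2, n),
})
df["used_signals"] = (rng.random(n) < 0.4).astype(int)
df["retained"] = (rng.random(n) < 0.7 + 0.05 * df["used_signals"]).astype(int)

# 1. Estimate each student's propensity to have used the tool.
covariates = ["prior_gpa", "credit_load", "pell_eligible"]
ps_model = LogisticRegression().fit(df[covariates], df["used_signals"])
df["propensity"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Match each user to the non-user with the closest propensity score.
users = df[df["used_signals"] == 1]
non_users = df[df["used_signals"] == 0]
matched_idx = [
    (non_users["propensity"] - p).abs().idxmin() for p in users["propensity"]
]
matched_controls = non_users.loc[matched_idx]

# 3. Compare retention between users and their matched controls.
effect = users["retained"].mean() - matched_controls["retained"].mean()
print(f"Estimated retention difference: {effect:+.3f}")
```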

Indeed, we have a lot to learn about how we bring the right data to the right people – students, faculty, and advisers — in the right way. A red traffic light flashing in the face of a first-generation student that says, “You are likely to fail” might be a disaster. It might just reaffirm what he or she feared all along (e.g., “I don’t belong here”) and lead to dropping out. Is there a better way to display the data that would be motivating to that student?

The chief data scientist at our company, David Kil, comes from the world of health care, where practitioners have learned the value of lifespan analysis and of rapidly testing interventions. He points out the importance of knowing both when to intervene and how to intervene. Moreover, health care has learned that sometimes data is best brought right to the patient in an app or even an SMS message; other times the message is better sent through nurses or peer coaches; and other times a conversation with a physician is the game changer. Regardless, testing the intervention for interface and impact on unique patient types, and for its impact on long-term health, is a must.

The parallel in education is clear: Signals was an important first step to break the data wall and bring more focus to the front lines. However, as Campbell notes, if we want these kinds of tools to become more useful, we need to design them with triggers and tools grounded in truly predictive models and create a large-scale community of practice to test their impact and utility with students, faculty, and advisers – and their long-term contribution to retention and graduation. Moreover, as Mike Caulfield notes, technology-assisted interventions need to be put in the larger context of other interventions and strategies, many of which are deeply personal and/or driven by face-to-face work in instruction, advising, and coaching. Indeed, front-line apps at their best might make the human moments in education more frequent, informed, and meaningful. Because, let’s be clear about it, at graduation students don’t get choked up about the apps that changed their lives.

Mark Milliron is Civitas Learning's chief learning officer and co-founder.

MOOC research conference confirms commonly held beliefs about the medium

At a conference on MOOC research, speakers back up commonly held beliefs about the medium with data.

Online Education: More Than MOOCs

"Online Education: More Than MOOCs" is a collection of news articles and opinion essays -- in print-on-demand format -- about the many forms of online learning that continue to develop outside the white-hot glare of hype surrounding massive open online courses. The articles aim to put recent developments in online education into long-term context, and the essays present the timely thinking of commentators about experts about how distance education is affecting learning and colleges' business models.

The goal is to provide some of Inside Higher Ed's best recent material (both news articles and opinion essays) in one easy-to-read place. Download the booklet here.

This is the fourth in a series of such compilations that Inside Higher Ed is publishing on a range of topics. On January 8, 2014, Inside Higher Ed will offer a free webinar in which Editors Scott Jaschik and Doug Lederman will discuss these issues. You can register for the webinar here.

 

Davidson Faculty Will Use MOOC to Help AP Students

Professors at Davidson College, working through the MOOC provider edX and the College Board, will begin preparing online tutorials on select topics in the Advanced Placement program, The New York Times reported. The goal is to create units covering specific topics within AP courses that may be tripping up students. The effort will start with AP courses in calculus, physics and macroeconomics.

 

Essay asks whether alt-ac careers are really a solution to academic jobs shortage

"Alt-ac" positions may be great for some, but Miriam Posner wonders if they are being oversold as a solution to the shortage of academic positions.

Professors who dislike online learning are not Luddites (essay)

A recent article in The Economist, “Learned Luddites,” described liberal arts instructors who refused to adopt MOOCs as “Luddites,” a term made famous in the 19th century by English textile workers who so feared that machinery would replace their jobs that they set about physically destroying the machines they used. To conclude that there is a connection between what the Luddites did and the arguments against online learning is a stretch, if not absurd, and it devalues the discussion happening in academic departments nationwide.

In America, after the launch of Sputnik in 1957 and the creation of the National Defense Education Act of 1958, emphasis was placed on math, science, and foreign language studies, as these three disciplines were deemed crucial to national security. Move forward 10 years and by the late 1960s one out of seven Americans was employed in the defense industry, military spending had risen from 1 percent to 10 percent of the gross domestic product, and corporations were increasingly profiting from an infusion of money from government contracts.

At the same time, high debt from domestic spending combined with outside competition from foreign markets was having an effect, and by the mid-1970s America had slipped into post-industrialism as jobs moved away from manufacturing toward more office-based and service-type employment opportunities.

The shift from the assembly line to office technology made a college degree a necessary component of a career, and as universities and community colleges began to accept more and more applicants, higher education began to shift course loads to part-time instructors.

Today, in 2013, a majority of those teaching in academia are working on a contingent basis. Tenure is nearly nonexistent, and liberal arts professors are being made to feel as though they are simply no more than an application, a helpmate, so to speak, that guides the student along as though they were a navigator steering a ship, following a mapped course not set by them, but by some far-off captain who serves as a default programmer for a higher purpose that is kept hush-hush until the time is right, a captain whose job it is to make sure the cargo arrives on time and without any scuffing from the occasional rogue wave.

At worst, more than a few professors feel they are becoming little more than a retention tool, a gimmick or novelty act whose entire future depends on whether or not one can “get with the program” of algorithmic evaluation, spreadsheet printouts, and a constant barrage of software programs designed to make keeping track of grades easier, as if a pen and pad were inherently inferior. All the while, the academic is asked to maintain a classroom atmosphere that is not only educational but also so entertaining that even the most mind-numbing of subjects can compete against the fixative trance of the portable handheld device.

Ironically, the analog education one received before the Digital Age, an educational model that emphasized literature and writing, is admired for its fine attention to detail, as detail is considered the hallmark of success. Yet that style of learning, though suitable for Fitzgerald and Stein, will not work in a world where students are groomed as future customers and national security is commingled with corporate wants that drive the areas of study that schools find most lucrative.

It is pathetically sad to think that a classroom could be reduced to a rectangular screen on a distant wall, or thought to be comparable to an interior space where a qualified human stands as the moderator before eyes that are watching. A cold, sterile scene from Orwell's 1984 comes to mind in a world where the educator is 20 miles away and the students are considered close.

As a professor, I am not opposed to online teaching, but I do believe we are losing more than we are gaining from a technological hypnosis that has the potential to reclassify the teacher as a network administrator. I am not a lab rat, nor do I want the classroom considered a lab. Our culture is fascinated with language bewitchment and making the obvious appear novel. Yet at the end of the day the MOOC is still no more than a student interacting with a computer, regardless of how convenient or user-friendly the experience has become.

If our embracing and use of technology becomes more important than our mission to teach, to meet in groups for discussion, or to sit one-on-one with a student seeking guidance, then not only should online education be critically evaluated for its unintended effects, but so should the very system that would interpret skepticism as regression.

Brooks Kohler is an adjunct instructor with an M.A. in history.

Praise, Criticism, Questions After Udacity 'Pivot'

Sebastian Thrun, founder of the massive open online course provider Udacity, is no stranger to controversy. The Stanford University research professor and Google fellow has previously said higher education in 50 years will be provided by no more than 10 institutions worldwide, and Udacity could be one of them. Thrun dropped another bombshell last week in a profile published in Fast Company, which claimed the “godfather of free online education” had changed course. “The man who started this revolution no longer believes the hype,” the article read. Instead of teaching hundreds of thousands of students in one session, Udacity’s future could look something like the company’s partnership with the Georgia Institute of Technology and AT&T to create a low-cost master’s degree.

In reality, Thrun’s shift is more nuanced. Call it a refinement -- not a loss of faith.

“I am much more upbeat than the article suggests,” Thrun said in an email to Inside Higher Ed. “Over the summer, we had students pay for services wrapped around our open classes, and the results were about [20 times] better when compared to students just taking open MOOCs. We have now built the necessary infrastructure to bring this model to more students, while keeping all materials open and free of charge as in the past.”

Some critics of Thrun’s vision interpreted the article as a bit of poetic justice:

“After two years of hype, breathless proclamations about how Udacity will transform higher education, Silicon Valley blindness to existing learning research, and numerous articles/interviews featuring Sebastian Thrun, Udacity has failed,” wrote George Siemens, associate director of the Technology Enhanced Knowledge Research Institute at Athabasca University. “This is not a failure of open education, learning at scale, online learning, or MOOCs. Thrun tied his fate too early to VC funding. As a result, Udacity is now driven by revenue pursuits, not innovation.”

Beyond schadenfreude, many responses cautioned against taking the profile as a sign that Thrun was abandoning higher education:

“It’s tempting to say good riddance,” wrote Michael Caulfield, director of blended and networked learning at Washington State University at Vancouver.

“Thrun can’t build a bucket that doesn’t leak, so he’s going to sell sieves,” Caulfield wrote. “Udacity dithered for a bit on whether it would be accountable for student outcomes. Failures at San Jose State put an end to that. The move now is to return to the original idea: high failure rates and dropouts are features, not bugs, because they represent a way to thin pools of applicants for potential employers. Thrun is moving to an area where he is unaccountable, because accountability is hard.”

Others said the profile marked a premature obituary for MOOCs, which exploded onto the higher education stage as recently as 2012:

“After a long period of unbridled optimism and world-changing claims about the transformative potential of MOOCs, journalists are now proclaiming that MOOCs are dead, or at the very least broken,” wrote Shriya Nevatia, an undergraduate at Tufts University. “This is extremely dangerous. Instead of companies taking their ambitious proclamations and working hard to make them true, they say that MOOCs have failed, before they’ve even had a chance.”

Among MOOC skeptics, Audrey Watters, an education writer (and blogger for Inside Higher Ed), cautioned against thinking the format is dead:

“The Fast Company article serves as the latest round in MOOC hagiography: Thrun, the patron saint of higher education disruption,” Watters wrote. “And whether you see today’s Fast Company article as indication of a ‘pivot’ or not, I think it’s a mistake to cheer this moment as Udacity’s admission of failure and as an indication that it intends to move away from university disruption.... So yeah, perhaps it’s easy for many in higher education to shrug and sigh with relief that Thrun has decided to set his sights elsewhere. But if we care about learning -- if we care about learners -- I think we need to maintain our fierce critiques about MOOCs.”

Pearson to report student outcomes, review investments as part of efficacy initiative

The online education giant will begin reporting all its student outcomes and review all its major investments as part of a five-year plan focused on efficacy.

Instead of waiting for lawmakers, IT officials say higher education should lead on privacy rules

Instead of waiting for lawmakers, IT officials and privacy experts say higher education should lead on privacy rules.
