
Apollo Buys Australian Online College

The Apollo Education Group's global division is buying 70 percent of Open Colleges Australia for $99 million, with additional payments of up to $48 million, the for-profit chain announced Tuesday in a news release. Founded in 1910, Open Colleges offers more than 130 online courses. Company officials hope the Australian institution "provides a platform for Apollo Global to operate and expand in other areas of the region."

California University Leaders Now Skeptical on Online Solutions

California has been the site of much high-level political excitement about the potential of new models of online education to provide introductory or remedial courses at low cost. But The San Jose Mercury News reported that the leaders of the University of California and California State University Systems -- in a joint appearance Friday -- were skeptical. Janet Napolitano, the UC president, said she thought online education probably wouldn't solve issues related to providing most courses, but could be a useful tool for specialized courses. Timothy White, the Cal State chancellor, meanwhile called the much-debated experiment between San Jose State University and Udacity a failure, the article said. It quoted him as saying: "For those who say, 'Well, Tim, you'll save a lot of money if ... you do more things online,' that's not correct." (A spokeswoman for California State University said Monday that the quotes attributed to White were inaccurate, and that his comments were not about a specific campus.)

 

Survey of distance education providers shows important metrics missing -- or withheld


A survey of distance education providers shows colleges and universities are failing to track -- or refusing to report -- course completion rates.

Have MOOCs hurt public perception of online education? (essay)

A little over 12 months ago, The New York Times famously dubbed 2012 “The Year of the MOOC.” What a difference 365 little days can make. Here at the back end of another calendar year, we wonder if 2013 might come to be thought of as “The Year of the Backlash” within the online higher education community.

Even Udacity's founder, Sebastian Thrun, one of the entrepreneurs whose businesses kicked off MOOC mania, seems to be getting into the backlash game.

According to Fast Company magazine, Thrun recently made the following observation regarding the evanescent hype surrounding MOOCs and his own company: "We were on the front pages of newspapers and magazines, and at the same time, I was realizing, we don't educate people as others wished, or as I wished. We have a lousy product."

Of course, the hype around this category hasn’t wholly abated. Coursera has just announced another $20 million infusion of venture capital. And MIT has just released a report embracing the disaggregation of the higher education value chain fomented by platforms such as edX.

But maybe Thrun is right. Maybe MOOCs are a lousy product – at least as initially conceived. And even if MOOCs are meaningfully reimagined, the mark they have made on the public consciousness to date could have lasting repercussions for the broader field of online learning.

It seems like only last year (in fact it was) that some were crediting elite institutions with “legitimizing” online learning through their experimentation with MOOCs. But what if instead of legitimizing online learning, MOOCs actually delegitimized it?

Perhaps this is why, currently, 56 percent of employers say they prefer an applicant with a traditional degree from an average college to one with an online degree from a top institution, according to a Public Agenda survey undertaken earlier this year.

We’ve been following online learning for a long time, and collectively share experiences in teaching online, earning credentials online, writing about online learning, analyzing the online learning market, and serving as administrators inside a research university with a significant stake in online and hybrid delivery models.

While some MOOC enthusiasts might like you to believe that online learning appeared out of nowhere, sui generis, in 2012, the reality is that we’ve been bringing courses and degree programs online for more than 20 years. Hardly born yesterday, online learning has evolved slowly and steadily, taking these two decades to reach the approximately one-third of all higher education students who have taken at least one online course, and serving as the preferred medium of delivery for roughly one-sixth of all students. The pace of adoption of online learning – among institutions, students, faculty, and employers – has been remarkably steady.

The advent of this so-called “lousy product” – the MOOC – may be triggering a change, however. Indeed, recent survey evidence suggests that the acceptance of online learning among certain constituencies may be plateauing. Is it possible that a backlash against MOOCs could even precipitate a decline in the broader acceptance of online learning?

The long-running Babson Survey Research Group/Sloan-C surveys show relatively little change in faculty acceptance of online instruction between 2002, when they first measured it, and the most recent survey data available, from 2011. The percentage of chief academic officers who indicated they agreed with the statement “faculty at my school accept the value and legitimacy of online education” grew only from 28 percent in 2002 to 31 percent in 2009 and 32 percent in 2011. According to a more recent Inside Higher Ed/Gallup survey, “only one in five [faculty agree] that online courses can achieve learning outcomes equivalent to those of in-person courses.”

We have to be careful making comparisons across surveys, audiences and time spans, of course. But there is a palpable sense here that something may have shifted for online learning in the last year or so, and that as a result of that shift, online learning may be in danger -- for the first time in some 20 years -- of losing momentum.

In recent months, we’ve witnessed faculty rebelling against online learning initiatives at institutions as diverse as Harvard, Duke, Rutgers, and San Jose State, to name a few. At San Jose State, faculty rallied to resist the use of Udacity courses on campus, but other instances of resistance did not even pertain to MOOCs – such as Duke’s decision to withdraw from the 2U-sponsored Semester Online consortium, or the vote by Rutgers’ Graduate School faculty to block the university’s planned rollout of online degree programs through its partnership with Pearson.

Our hypothesis is that MOOCs are playing a role here – chiefly by confusing higher education stakeholders about what online learning really is. By and large, of course, online learning isn’t massive and it isn’t open. And by and large, it does actually involve real courses, genuine coursework and assessment, meaningful faculty interaction, and the awarding of credentials – namely, degrees.

In numerous focus groups and surveys we have conducted over the course of 2013, both prospective students and employers have raised concerns about online learning that we had not been hearing in years past – concerns that have been chiefly related to the level of faculty interaction with students, the relationship between quality and price, and the utility of courses that don’t lead to recognized credentials.

The net contribution of the MOOC phenomenon, for the moment at least, may be a backsliding in the general acceptance of online learning – not least among faculty, who may fear they have the most to lose from MOOC mania, especially in the wake of controversial legislative proposals in a variety of states mandating that MOOCs be deemed creditworthy, thereby threatening further public divestment in higher education.

For those of us who have nurtured the growth and strengthening of online learning over many years, this would be an unfortunate outcome of the MOOC moment.

If there is a backlash under way, and if that backlash is contributing to an erosion of confidence in the quality of online learning generally, that is something that won’t be overcome in a single hype cycle – it will take time, just as degree-bearing online learning programs took time to develop and establish themselves. Possibly even more than one year.

Peter Stokes is vice president of global strategy and business development at Northeastern University, and author of the Peripheral Vision column. Sean Gallagher is chief strategy officer at Northeastern University.


For information security officers, the battle against breaches is fought on two fronts


A series of security breaches shows that malicious attacks don't always originate in China. Some are coming from students hoping to cheat.

Essay on the real meaning of the debate over an early notification system for students

Signals has had a rough few months. Blog posts, articles, and pointed posts on social media have recently taken both the creators and promoters of the tool to task for inflated retention claims and for falling into common statistical traps in making those claims. Some of the wounds are self-inflicted — not responding is rarely received well. Others, however, are misunderstandings of the founding goals and the exciting next phases of related work.

Signals is a technology application originally created by a team at Purdue University that uses a basic rules-based set of predictions and triggers — based on years of educational research and insight from the university's faculty — and combines them with real-time activity of students in a given course. It then uses a “traffic light” interface with students that sends them a familiar kind of message (a rough sketch of what such rules might look like follows the list below):

  • Green light: you’re doing well and on the right track.
  • Yellow light: you’re a little off track; you might want to think about x, y, or z, or talk with someone.
  • Red light: you’re in trouble. You probably need to reach out to someone for help.
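
To make that mechanism concrete, here is a minimal, hypothetical sketch of a rules-based traffic-light trigger. The thresholds and field names are invented for illustration; this is not Purdue's or Ellucian's actual implementation.

```python
# Hypothetical rules-based "traffic light" trigger -- illustration only.
def traffic_light(activity: dict) -> str:
    """Classify a student's standing in a course as green, yellow, or red.

    `activity` is assumed to hold simple course metrics, e.g.:
      {"grade_pct": 72.0, "logins_last_week": 3, "assignments_missed": 1}
    """
    grade = activity.get("grade_pct", 0.0)
    logins = activity.get("logins_last_week", 0)
    missed = activity.get("assignments_missed", 0)

    if grade >= 80 and missed == 0:
        return "green"   # doing well and on the right track
    if grade < 65 or missed >= 3 or logins == 0:
        return "red"     # in trouble; reach out for help
    return "yellow"      # a little off track; consider talking with someone


print(traffic_light({"grade_pct": 72.0, "logins_last_week": 3, "assignments_missed": 1}))
# -> yellow
```

In the actual Signals work, such rules were informed by educational research and faculty insight rather than ad hoc thresholds like these; the point of the sketch is only the shape of the logic.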

These same data are also shared with faculty and administrators so they can reach out to those who need specific support in overcoming an academic challenge or just extra encouragement. Signals is now part of the services offered by Ellucian, but just about all the major players in education technology offer some version of "early warning" applications. In our insight and analytics work, a number of colleges and universities are piloting related apps; however, the promise and problems of Signals are an important predicate as we move forward with that work.

The Signals app project began with a clear and compelling goal: to allow students, faculty, and advisers access to data that might help them navigate the learning journey. For too long, the key data work in education has been focused on reporting, accreditation, or research that leads to long reports that few people see and that are all too often used to make excuses, brag, blame, or shame. More problematic, most of these uses happen after courses are over or, worse, after students have already dropped out.

The Signals team was trying to turn that on its head by gathering some useful feedback data that research suggests may help students navigate a given course, and by giving more information to faculty and administrators dedicated to helping them in the process. The course-level outcomes were strong. More students earned As, fewer got Fs, and the qualitative comments made it clear that many students appreciated the feedback. A welcome “wake-up call,” many called it.

John Campbell, then the associate vice president of information technology at Purdue, was committed to this vision. In numerous presentations he argued that “Signals was an attempt to take what decades of educational research was saying was important — tighter feedback loops — and create a clean, simple way for students, faculty, and advisers to get feedback that would be useful.”

Signals was a vital, high-profile first step in the process of turning the power of educational data work toward getting clean, clear, and usable information to the front lines. It was a pioneer in this work and should be recognized as such. The trouble is the conflation of this work with large-scale retention and student success efforts. Claiming a 21 percent long-term retention lift, as some at Purdue have, is a significant stretch at best. However, Signals has shown itself to be a useful tool for helping students navigate specific courses, and for faculty and staff striving to support them. And while that will likely be useful in long-term retention, there is still much work to be done both to bring Signals to the next level of utility in courses and to test its impact on larger student success initiatives.

First, as Campbell, now CIO at West Virginia University notes, Signals has to truly leverage analytics. In our recent conversation he posited, “The only way to bring apps like Signals to their full potential, to bring them to scale, to make them sustainable is through analytics.” Front-line tools like Signals have to be powered by analyses that bring better and more personalized insight into individual students based on large-scale, consistently updated, student-level predictive models of pathways through a given institution. Put simply, basing the triggers and tools of these front-line systems on blunt, best-practice rules is not personalized, but generalized. It’s probably useful, but not optimal for that individual student. There needs to be a “next generation” of Signals, as Campbell notes, one that is more sophisticated and personalized.

For example, with a better understanding of the entire student pathway derived from analytics anchored on individual-level course completion, retention, and graduation predictions, a student who was highly likely to struggle in a given course from day one — e.g., a student having consistent difficulty with writing-intensive courses who is trying to take three simultaneously — might be advised away from a combination of courses that could be toxic for him or her. By better balancing the course selection, the problem — which would not necessarily be the challenge of a given course — could be solved before it begins. In addition, an institution may find that for a cluster of students standard “triggers” for intervention are meaningless. We’ve seen institutions that are serving military officers who have stellar completion and grade patterns over multiple semesters; however, because of the challenges of their day jobs, regular attendance patterns are not the norm. A generalized predictive model that pings instructors, advisers, or automated systems to intervene with these students may be simply annoying a highly capable student and/or wasting the time of faculty and advisers who are pushed to intervene.
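
As a rough illustration of the more personalized "next generation" approach Campbell describes, the sketch below trains a simple student-level model on invented historical data and produces an individual completion probability rather than applying a blunt, one-size-fits-all rule. The features, the data, and the 0.6 cutoff are assumptions for illustration, not any institution's or vendor's production pipeline.

```python
# Hypothetical student-level predictive model -- illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Invented historical records: [prior_gpa, writing_intensive_load, weekly_logins]
X = rng.normal(loc=[2.8, 1.5, 4.0], scale=[0.6, 1.0, 2.0], size=(500, 3))
# Invented outcomes: 1 = completed the course, 0 = did not
logits = 1.2 * (X[:, 0] - 2.8) - 0.8 * (X[:, 1] - 1.5) + 0.3 * (X[:, 2] - 4.0)
y = (rng.random(500) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(X, y)

# Personalized risk for one student planning three writing-intensive courses at once
student = np.array([[2.4, 3.0, 5.0]])
completion_prob = model.predict_proba(student)[0, 1]
if completion_prob < 0.6:  # the cutoff is an assumption an institution would tune
    print(f"Flag for advising: predicted completion probability is {completion_prob:.2f}")
```

The design point is that the trigger comes from a model of that student's own pathway, which could just as easily learn that irregular attendance is unremarkable for, say, the military officers described above.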

Second, these tools have to be studied and tuned to better understand and maximize their positive impact on diverse student populations. With large-scale predictive flow models of student progression and propensity-score matching, for example, we can better understand how these tools contribute to long-term student success. Moreover, we can do tighter testing on the impact of user-interface design.
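
The following is a minimal sketch, on invented data, of the propensity-score matching idea mentioned above: pair each student who received an early alert with a statistically similar student who did not, then compare retention between the two groups. It is illustrative only, not an actual evaluation pipeline; the covariates and effect size are made up.

```python
# Hypothetical propensity-score matching study -- illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
n = 1000
covariates = rng.normal(size=(n, 3))                 # e.g., standardized prior GPA, credits, age
treated = rng.random(n) < 0.4                        # received the early-alert intervention
retained = (rng.random(n) < 0.70 + 0.05 * treated).astype(int)  # toy retention outcome

# 1. Estimate propensity scores: P(treated | covariates)
propensity = LogisticRegression().fit(covariates, treated).predict_proba(covariates)[:, 1]

# 2. Match each treated student to the untreated student with the closest score
nn = NearestNeighbors(n_neighbors=1).fit(propensity[~treated].reshape(-1, 1))
_, idx = nn.kneighbors(propensity[treated].reshape(-1, 1))
matched_control_outcomes = retained[~treated][idx.ravel()]

# 3. Compare retention for treated students against their matched controls
effect = retained[treated].mean() - matched_control_outcomes.mean()
print(f"Estimated retention effect of the intervention: {effect:+.3f}")
```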

Indeed, we have a lot to learn about how we bring the right data to the right people – students, faculty, and advisers — in the right way. A red traffic light flashing in the face of a first-generation student that says, “You are likely to fail” might be a disaster. It might just reaffirm what he or she feared all along (e.g., “I don’t belong here”) and lead to dropping out. Is there a better way to display the data that would be motivating to that student?

The chief data scientist at our company, David Kil, comes from the world of health care, where practitioners have learned the value of lifespan analysis and of rapidly testing interventions. He points out the importance of knowing both when to intervene and how to intervene. Moreover, they learned that sometimes data is best brought right to the patient in an app or even an SMS message; other times the message is better sent through nurses or peer coaches; and other times a conversation with a physician is the game changer. Regardless, testing an intervention’s interface, its impact on unique patient types, and its impact on long-term health is a must.

The parallel in education is clear: Signals was an important first step to break the data wall and bring more focus to the front lines. However, as Campbell notes, if we want these kinds of tools to become more useful, we need to design them with triggers and tools grounded in truly predictive models and create a large-scale community of practice to test their impact and utility with students, faculty, and advisers – and their long-term contribution to retention and graduation. Moreover, as Mike Caulfield notes, technology-assisted interventions need to be put in the larger context of other interventions and strategies, many of which are deeply personal and/or driven by face-to-face work in instruction, advising, and coaching. Indeed, front-line apps at their best might make the human moments in education more frequent, informed, and meaningful. Because, let’s be clear about it, students don’t get choked up at graduation about the apps that changed their lives.

Mark Milliron is Civitas Learning's chief learning officer and co-founder.


MOOC research conference confirms commonly held beliefs about the medium


At a conference on MOOC research, speakers back up commonly held beliefs about the medium with data.

Online Education: More Than MOOCs

"Online Education: More Than MOOCs" is a collection of news articles and opinion essays -- in print-on-demand format -- about the many forms of online learning that continue to develop outside the white-hot glare of hype surrounding massive open online courses. The articles aim to put recent developments in online education into long-term context, and the essays present the timely thinking of commentators about experts about how distance education is affecting learning and colleges' business models.

The goal is to provide some of Inside Higher Ed's best recent material (both news articles and opinion essays) in one easy-to-read place. Download the booklet here.

This is the fourth in a series of such compilations that Inside Higher Ed is publishing on a range of topics. On January 8, 2014, Inside Higher Ed will offer a free webinar in which Editors Scott Jaschik and Doug Lederman will discuss these issues. You can register for the webinar here.

 

Millions Affected by Maricopa Security Breach

IT security problems in the Maricopa County Community College District may have put the personal information of almost 2.5 million students, employees and suppliers at risk, the institutions warned on Wednesday. 

Federal law enforcement alerted the district to the problems in April, setting off a review that would eventually unearth vulnerabilities that exposed "sensitive information including individual names, dates of birth, Social Security numbers and bank account information, but not credit card information or health records." The district is not aware of any actual security breaches, however.

MCCCD, which consists of 10 colleges in the greater Phoenix region, has partnered with Kroll Advisory Solutions, a cybersecurity company, to address the vulnerabilities. The district may also replace employees who "did not meet the district’s standards and expectations," according to a press release.


Rejecting the for/against dichotomy about online learning (essay)

An article in these pages last week, "We Are Not Luddites," by Brooks Kohler, argues that being skeptical of online learning does not make one a Luddite.

Very well, then. I think most academics would agree. If his article had gone on to critique the tendency of tech folks to alienate skeptics of online learning by labeling them backward or hopelessly outdated, I would have been on board.

But Kohler takes a curious turn when he writes that liberal arts instructors who welcome online learning are in a state of “technological hypnosis.” Students, according to Kohler, are in a “fixative trance.” Apparently digital technology is a dangling medallion swinging back and forth, and we are all getting very, very sleepy.

Kohler goes on to describe a “pathetically sad” scene in which “a classroom could be reduced to a rectangle (sic) screen on a distant wall, or thought to be comparable to that of a interior space where a qualified human stands as the moderator before eyes that are watching.” Online learning to Kohler is inherently dystopian, akin to Orwell’s 1984, while the face-to-face classroom is, in contrast, natural and human.

This conversation calls to mind Plato’s Phaedrus. In this dialogue, Socrates laments the technology of writing because he fears it will diminish memory skills if Athenian citizens no longer have to memorize and practice oral discourse.

Worse yet, writing is inferior to speech, according to Socrates, because we can’t argue with a piece of paper like a living person; writing only has the appearance of wisdom, not wisdom itself.

Frankly, I’m not interested in reinforcing such a strict for/against dichotomy when discussing online learning and new digital technologies.  I think such binary thinking is part of the problem.

I teach face-to-face, online, and blended sections of composition at a small rural state university and I see strengths and limitations in all three approaches. My online classes look nothing like Kohler’s panoptic nightmare. Or, at least, I hope they do not -- now that I think of it, perhaps students calling me Big Brother isn’t a term of endearment after all.

Kohler does not take kindly to being called a Luddite, yet he suggests teachers and students working hard to make online learning rigorous, academic and accessible are hypnotized dupes attracted to shiny surfaces and entranced by blinking lights. Worse yet, he charges that online learning encourages contingent academic labor and the demise of tenure-track positions when in fact this erosion has been a decades-long process with roots extending long before online learning.

Notice I’ve been using the term “online learning” and not “MOOCs,” the latter against which I harbor a much deeper skepticism, but that’s a story for another time. I highlight this distinction because a sleight of hand occurs when Kohler begins his article by discussing MOOCs only to substitute that digital phenomenon with a more generalized “online learning” later in the same paragraph.

I’m not just splitting hairs. MOOCs and online learning are too often conflated. They are, of course, not the same thing. Suggesting otherwise is merely shoving stuffing into a straw man. The problems of MOOCs do not automatically extend to online learning in general.

A similar game of three-card monte is performed when Kohler uses a generalized “technology” when he really means new digital technologies. This slippage leads to historical and theoretical quandaries.

For example, when Kohler chortles “as if a pen and pad were inherently inferior” he fails to recognize that pen and paper are technologies, and that writing itself is a technology, as Walter Ong famously argued. Conflating new digital technologies that facilitate online learning with technology in general results in a fixed, narrow, and uncomplicated definition of technology.

Again, this isn’t academic hair-splitting. Such a distinction is helpful because it leads our dialogue away from dystopic visions and forces us to confront the fact that even analog technology like Kohler’s “pen and pad” shape how and what we learn.

That teachers believe online learning can be a worthwhile experience does not mean that we are hypnotized, nor does it mean that we are chasing fads and abandoning “literature and writing” and a “fine attention to detail,” as Kohler claims.

Instead of accusing one another of being either entranced by new technologies or Luddites, we should be cultivating dialogue, criticism and best practices to make online education better.

We should also pay more attention to issues of race, class and access when it comes to online learning. And we should be building space and time into our online courses for students to reflect on their own skepticism and concerns with digital learning. Including students in this dialogue is essential.

I too am skeptical of online learning. However, this skepticism does not lead me away from online teaching, but toward it. I want to make it better. I believe it’s our duty to make it better. Drawing broad caricatures of online teachers and students only reinforces the importance of not devolving into a strict for/against dichotomy in our dialogue.

John F. Raucci Jr. is an assistant professor of English at Frostburg State University.

