Experiments with adaptive learning at 14 colleges and universities have found that the software has no significant average effect on course completion rates, has a slight positive effect on student grades and does not immediately lead to lower costs. And after using the software for three academic terms, fewer than half of the instructors involved say they will continue to use adaptive courseware.

“The ultimate goal -- better student outcomes at lower cost -- remains elusive,” reads the report on the findings, compiled by SRI Education, a division of the research and development firm SRI International. That conclusion is far removed from the praise and excitement about adaptive learning -- software that adjusts based on whether students answer questions correctly -- that one hears at ed-tech conferences and elsewhere.

Still, market analysts and officials at the Bill & Melinda Gates Foundation, which funded the experiments and the study, said it is too early to draw conclusions about the “nascent” adaptive learning market. While the top-level findings may not provide many answers about the efficacy of adaptive learning, they pointed out that the report contains potentially promising findings for students at two-year colleges and in remedial courses.

“There’s no magical silver bullet here, but in the spirit of improving the field, we think this data and learning is really quite valuable and that the field has a responsibility to build on top of that,” said Rahim Rajan, a senior program officer for Gates. “There’s an imperative nationally for colleges and universities to take these kinds of experiments further.”

The report, published this spring, marks the conclusion of the Adaptive Learning Market Acceleration Program (ALMAP), which from summer 2013 through winter 2015 funded 14 two- and four-year colleges to test adaptive learning software in 23 blended and fully online remedial and introductory courses and compare the results to sections that didn't use the software.

“We believe that well-implemented personalized and adaptive learning has the potential to dramatically improve student outcomes,” the foundation said in March 2013. “Our strategy to accelerate the adoption of adaptive learning in higher education is to invest in market change drivers … resulting in strong, healthy market growth.”

In addition to boosting market adoption of adaptive learning, the program explored whether the software could have an effect on the factors that make up the “iron triangle” of higher education -- access, costs and quality.

Each college received a $100,000 grant to participate in the program. The colleges selected adaptive learning software from major providers in the market -- including Adapt Courseware, Knewton, McGraw-Hill Education, Pearson, Smart Sparrow and others -- then put it to use in 15 introductory and seven remedial courses. Some instructors turned traditional lecture courses into blended classrooms; others took blended or fully online courses and added adaptive elements.

In total, the program involved more than 280 instructors teaching more than 19,500 students enrolled in 10 bachelor’s and four associate degree programs. Of the 23 courses, seven were in English, six in mathematics, four in biology, four in social science and two in business.

The report paints an inconclusive picture of the effect adaptive learning has on completion rates, costs and grades. In 11 of 15 courses, adaptive courseware had no impact on student grades. Students in the remaining four courses earned only “slightly higher” average grades. In no case did the researchers find evidence of the software affecting completion rates.

In the seven cases for which the researchers were able to collect enough data for side-by-side comparisons of learning assessment scores, however, they found that adaptive courseware led to “modest but significantly positive” results.

Findings on the ability of adaptive courseware to lower costs were similarly inconclusive. Course costs in most cases increased the first time instructors started using adaptive learning software, though when the instructors offered the courses a second and third time, costs fell in seven out of 10 cases.

Surveys of how faculty members and students felt about using adaptive software revealed gaps between students and instructors in remedial and introductory courses as well as between students at two- and four-year colleges.

Overall, about three-quarters of instructors (76 percent) said they were satisfied with the software, though faculty members teaching introductory courses were split on the issue -- 49 percent said they were satisfied, compared to 67 percent of remedial course instructors.

Students in remedial courses said they were highly engaged (60 percent) by the adaptive courseware, and nearly all of them saw learning gains (95 percent). Engagement was significantly lower among students in introductory courses (25 percent), as was the share of students who said they saw learning gains (35 percent). While about half (56 percent) of students in associate degree programs said they were satisfied with adaptive courseware, only one-third (33 percent) of students in bachelor’s degree programs said the same.

The report suggests the low satisfaction rating among students at four-year colleges may help explain why fewer than half of surveyed faculty members said they plan to keep using adaptive learning software in their courses.

Yvonne Belanger, also a senior program officer for Gates, said the report helps the foundation understand the settings in which adaptive learning can make a difference.

In addition to the higher level of enthusiasm at two-year colleges, Belanger highlighted the finding that students who received Pell Grants did not score any differently than other students. Other projects that have explored whether online learning can help disadvantaged students -- for example, an experiment with massive open online courses at San Jose State University -- have found those students sometimes struggle without the support they receive in a face-to-face setting.

“What we saw is there were a variety of faculty experiences,” Belanger said. “The report surfaced that there are areas where the technology seems to be further along and settings … where the technology did seem to have more impact.”

The findings are further weakened by data collection issues. Several colleges were unable to provide data on student outcomes and ongoing costs that met SRI’s criteria for inclusion in the analysis. There was, for example, not enough data to compare outcomes in regular blended courses to outcomes in blended courses that used adaptive software. And in some cases, instructors "reported challenges in getting students to use the courseware frequently enough to achieve benefits."

Belanger said some colleges had “more capacity to participate than others.” Two-year institutions, in particular, had a more challenging time collecting the right data -- yet the foundation needed to partner with a broad range of colleges to test the efficacy of adaptive learning in different settings and with different student populations, she said.

“We learned a lot about what it takes to support colleges, what we need to do as a funder,” Belanger said.

Gates has a related follow-up program already in the works. The Next Generation Courseware Challenge, announced in September 2014, brings back many of the vendors involved in the Adaptive Learning Market Acceleration Program to develop digital course materials aimed at helping disadvantaged students succeed in college.

Adam Newman, founder of the strategy consulting and investment banking firm Tyton Partners, said the “jury is still out” on adaptive learning. Like MOOC providers, he said, companies behind adaptive learning software also need time to experiment and adjust to find products and business models that work for them and their customers.

The report therefore represents a “thoughtful effort” by “early-stage businesses with nascent commercial applications to try to create some evidence,” he said, and the results establish a baseline that can help vendors further develop their products and help colleges decide how to use adaptive courseware.

“All of these are great findings that the believers will build on and the haters will hew to as reasons why not to do it,” Newman said. “You can’t brush it under the rug, and you shouldn’t use it as evidence that adaptive learning doesn’t work.”
