In a thoughtful commentary published in Inside Higher Ed earlier this year, my friend and colleague Lev Gonick, vice president and CIO at Case Western Reserve University, proclaimed that “course management systems are dead; long live course management systems.” This was one of his eleven IT predictions for 2009.

At a time when the Course or Learning Management System (LMS) has become an embedded, if not indeed an essential, element of the college experience for students across all sectors of American higher education, Gonick’s proclamation seemed, at face value, contrarian. But Lev focused his assessment primarily on the fate of proprietary systems (read Angel, Blackboard, and Desire2Learn):

"Proprietary course management systems are heading for a brick wall. The combination of economic pressures combined with saturated markets and the maturing stage of the life cycle of these once innovative platforms means that 2009 may well be the year of change or a year of serious planning for change.

Relatively inexpensive and feature-comparable open source alternatives, combined with some now learned experience in the process of transition from closed to open systems for the inventory of repeating courses, makes real change in this once bedrock of education technology a growing possibility. As product managers and management view these trend lines, I think we might see incumbent players make a valiant effort to re-invent themselves before the market drops out from underneath them. Look for the number of major campuses moving (or making serious threats to move) from closed systems to open ones to climb in the year ahead."

There is no question that the campus community has become increasingly dependent on the LMS to support, supplement -- and at times even shape -- instruction. The LMS is widely deployed across all sectors of higher education. Data from the 2009 Campus Computing Survey indicate that 92 percent of institutions have standardized on a single LMS product for the entire campus; CIOs estimate that as of fall 2009, more than half (55 percent) of classes make some use of an LMS, up from 50 percent in 2007 and 34 percent in 2003.

These data confirm the anecdotal reports that over the past decade the LMS has become a core part of the academic experience for both students and faculty. Concurrent with the rising deployment of the LMS, some campuses have seen near-religious wars over LMS platforms that occasionally rival the passion of the Mac vs. PC debates.

Yet Gonick’s assessment speaks to the volatility in the LMS market as many campuses reassess their LMS strategy. Most campuses have been with their initial LMS provider for a very long time: new data from the recently released 2008 EDUCAUSE Core Data Survey confirm that for most institutions, the initial selection and implementation occurred early in the current decade.

The applications have evolved (and costs have increased), and so too have campus needs for and expectations of the LMS. Reflecting the predicted volatility, almost half (47 percent) of the 182 campuses that participated in the recently released "Managing Online Education" survey conducted by WCET and the Campus Computing Project indicated that they are currently reviewing their LMS strategy; more than a fourth (28 percent) of the MOE survey respondents report that their campus will likely change its LMS in the next two years.

Yet for all the operational angst (and occasional anger) about the LMS in general (or specific LMS features, systems, or providers in particular), we would do well to remember that these are relatively young applications, barely a decade old at best. Permit me a moment of shameless self-promotion: back in 2004 I wrote that the CMS or LMS arena in higher education was “a mature market with immature products.”

Using the standard metrics of a business school case study, even in 2004 the LMS market was mature, as almost all campuses already had an institutional license. (Course deployment, of course, was and remains a different issue.) At the same time, these were very young products in 2004. Taken together, the maturity of the market coupled with the immaturity (or, if you prefer, the rapid evolution) of the products pointed to coming transitions (if not turmoil) in the market.

Now, early in the second decade of the campus experience with the LMS, we are at one of those key transition points. Yes, as Gonick commented earlier this year, there is a slow yet clear transition under way from proprietary to open source LMS applications across all sectors of American higher education. Will the proprietary LMS providers vanish in the next five years? Unlikely. But there are clear signs that the competition in the higher education LMS market these days is less about the battle for market share among the remaining proprietary providers and more about proprietary vs. open source options.

Yet the more significant transition in the LMS arena is what I would describe as the arrival of LMS 3.0. And over time this transition is less about code and accompanying costs -- proprietary vs. open source -- and more about extracting data, information, and insight from the transactional data that capture the student and faculty interactions with the LMS.

In other words, the value of the LMS will increase as it migrates from a resource for content and services into a source for real time data about academic activity and student behavior.

Let’s review. The early CMS/LMS applications -- some available for individual purchase, some licensed to individual departments or across the institution -- focused on a relatively simple task: helping faculty post instructional resources (syllabuses, reprints, etc.) to the Web.

Much like the Apple I computer, LMS 1.0 was short-lived: once we could touch it, we wanted more -- much more! Consequently, LMS 2.0 reflected broader needs and rising aspirations. The migration from LMS 1.0 to LMS 2.0 was not so much about making it easier to post stuff (instructional resources) to the Web (but yes, easier mattered, and still does) as about the actual stuff -- richer, more engaging and iterative content as well as better resources and services (chat rooms, grade books, etc.).

During the comparatively long LMS 2.0 phase we saw added functionality as well as more content, accompanied by content alliances. All the major higher education publishers declared themselves to be LMS friendly. Yet some publishers were friendlier with specific LMS providers than others as they developed marketing alliances, and, at times, even financial relationships with selected LMS firms. Concurrently, content resources such as “course cartridges” that would “feed” or support the LMS often became part of the new ancillaries that publishers provided to support and supplement their textbooks. Some publishers began to offer complete Web sites to support their undergraduate titles.

LMS 3.0 marks the transition from the LMS as an instructional resource and service for students and faculty to a key source for critical transactional data about academic interaction and student engagement. And let’s be candid about what this means: although the analogy may be offensive to many in the campus community, the LMS is higher ed’s version of the supermarket scanner. The LMS records and stores valuable data about student interactions with academic resources, much the way the supermarket scanner records my purchases of (and preference for) bananas and dark beer.

The transactional data from the LMS can tell us much about the aggregated and individual student interaction with course content outside of the classroom (or in the case of online courses, away from the chat room!). The transactional data from the LMS -- what students do while “in” the LMS for an individual class and how long they are “in” the LMS -- are the new metrics for student engagement and time on task.
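
To make that concrete, here is a minimal sketch of how a campus analyst might turn raw LMS click logs into time-on-task figures. Everything in it is an assumption for illustration -- the hypothetical lms_events.csv export, its user_id/course_id/timestamp columns, and the 30-minute inactivity cutoff used to split sessions -- rather than a feature of any particular LMS.

```python
# Sketch: deriving "time on task" per student per course from raw LMS click logs.
# Assumes a hypothetical CSV export with columns: user_id, course_id, timestamp.
# The 30-minute inactivity cutoff for splitting sessions is an arbitrary choice.
import pandas as pd

SESSION_TIMEOUT = pd.Timedelta(minutes=30)

events = pd.read_csv("lms_events.csv", parse_dates=["timestamp"])
events = events.sort_values(["user_id", "course_id", "timestamp"])

# A new session starts when the gap since the previous click exceeds the timeout.
gap = events.groupby(["user_id", "course_id"])["timestamp"].diff()
events["session_id"] = (gap.isna() | (gap > SESSION_TIMEOUT)).cumsum()

# Session length = last click minus first click within each session.
sessions = events.groupby(["user_id", "course_id", "session_id"])["timestamp"].agg(["min", "max"])
sessions["duration"] = sessions["max"] - sessions["min"]

# Roll up to simple engagement metrics per student per course.
engagement = sessions.groupby(level=["user_id", "course_id"]).agg(
    sessions=("duration", "size"),
    total_time=("duration", "sum"),
)
engagement["total_time_hours"] = engagement["total_time"].dt.total_seconds() / 3600
print(engagement.head())
```

The resulting per-student, per-course session counts and hours are the sort of engagement and time-on-task measures described above.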

What evidence supports my thesis of the emergence of LMS 3.0? In recent years we’ve begun to see campuses compile the transactional data from their LMS and merge it with other information, typically course grades and demographic data extracted from the Student Information System. Catherine Finnigan at the University of Georgia System and John Campbell at Purdue, among others, have done some of the early and important work using transactional data from the LMS as part of broader efforts to assess the impact of information technology on student learning and outcomes.
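
The merge at the heart of that kind of work is mechanically simple. The sketch below is again hypothetical -- the file names, the student_id/final_grade_points/pell_eligible columns, and the quick correlation at the end are placeholders for illustration, not a description of the Georgia or Purdue analyses.

```python
# Sketch: joining LMS engagement metrics with SIS grades and demographics.
# File and column names are hypothetical placeholders.
import pandas as pd

engagement = pd.read_csv("lms_engagement.csv")  # user_id, course_id, sessions, total_time_hours
sis = pd.read_csv("sis_outcomes.csv")           # student_id, course_id, final_grade_points, pell_eligible

# Align the key names, then join LMS activity to course outcomes.
sis = sis.rename(columns={"student_id": "user_id"})
merged = engagement.merge(sis, on=["user_id", "course_id"], how="inner")

# First-pass questions: does time in the LMS track with final grades,
# and does the pattern differ across a demographic slice?
print(merged["total_time_hours"].corr(merged["final_grade_points"]))
print(merged.groupby("pell_eligible")["total_time_hours"].describe())
```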

But the value of the LMS transactional data goes well beyond variables that might help to assess our investments in information technology (i.e., “does IT make a difference in student learning?”). Just as the supermarket scanner provides data about consumer behavior, so too do the transactional data from the LMS offer great potential to tell us a lot about student behavior -- data that can be used as an institutional resource for program enhancement, and also data that can be used for individual student interventions.

Beyond the work of individual researchers, the market is also sending clear signals of the emergence of LMS 3.0. eCollege, acquired by Pearson in May 2007, has long offered some embedded analytical tools as part of the company’s LMS offering. Blackboard, the dominant provider in the campus LMS market, launched its branded Outcomes System in January 2007. Although the number of institutional licensees is not large (just 32 campuses at year end 2008, according to public data released by Blackboard), Blackboard’s launch of an Outcomes offering reflects what we can infer was the company’s assessment of a need on the part of its campus clients -- “help us with outcomes assessments” -- as well as an opportunity to develop a new commercial product.

Other indicators also affirm the importance that campuses and their IT providers place on extracting value from the transactional data in the LMS. Many campuses are beginning to deploy Business Intelligence (BI) and CRM (Client/Customer Relationship Management) software as analytical resources for student outcomes and retention analyses.

Moreover, the traditional providers of administrative or ERP (Enterprise Resource Planning) applications for postsecondary education, often criticized in the past for the absence of sophisticated analytical tools in their products, are now promoting their alliances with, support for, and integration of LMS and related applications.

Campus Management was perhaps the first ERP provider to announce support for Moodle, an open source LMS. In October, Datatel announced an exclusive alliance with Moodlerooms that will help integrate Moodle with Datatel’s information systems. Finally, SunGard Higher Education’s pre-EDUCAUSE conference announcements -- one an alliance with Epsilen, the New York Times-owned company that provides both newspaper archives and ePortfolio applications for the college market, the second with Purdue for the university’s Signals early intervention system -- speak to the rising role of the LMS as a source for valuable data essential to assessment and evaluation analyses and interventions.

The new interest and efforts of ERP providers to exploit the transactional data that reside in the LMS reflect the demise of the “ERP Turtle” -- the data and organizational silos that traditionally marked the separate, usually unconnected functions of the campus administrative system: admissions data, student records, finance, etc. As the LMS has emerged as a de facto element of the enterprise system in recent years, campus officials and their information systems providers have begun to recognize the value of the transactional data compiled by the LMS.

Although Margaret Spellings is no longer the U.S. education secretary, the three-year-old Spellings Commission Report, released in September 2006, still casts a wide shadow over the discussions about institutional effectiveness and student outcomes.

Secretary Spellings would often cite (without attribution) W. Edwards Deming when she told education audiences that “back in Texas we like to say, ‘In God we Trust; all others bring data.’ ” Admittedly the Deming quip was a charming if disarming way to deal with critics of the testing mandates associated with the No Child Left Behind legislation. But the essential truth remains: the campus and public policy discussions about institutional effectiveness are less tolerant of opinion and epiphany, and are increasingly focused on data and evidence.

In this context LMS 3.0 also speaks to the increasingly important role of institutional IT officials in the campus conversations about evaluation, assessment, and outcomes. We now have the analytical tools (business intelligence/analytics, data mining, and data warehousing, among others) to use routine, unobtrusive institutional and transactional data (high school transcripts, students’ test scores, students’ records from ERP modules, transactional data from learning management systems, college/university transcripts, and more) to address the critical assessment and outcomes issues that affect colleges and universities.

For campus IT officials, the issue that emerges in the wake of the Spellings Commission Report about institutional effectiveness, student learning, and student outcomes is not if but when college and university IT leaders will assume an active role -- a leadership role -- in these discussions, bringing their IT resources and expertise -- bringing data, information, and insight -- to the critical planning and policy conversations about institutional assessment and outcomes that affect all sectors of American higher education.

The emergence of LMS 3.0 is one part of this process; LMS 3.0 will be a critical source of the data that will aid and inform these efforts.

Author disclosure: Blackboard, Campus Management, Datatel, Pearson, and SunGard Higher Education are corporate sponsors of The Campus Computing Project.
