Digital Tweed

    Digital Tweed® is the work of Kenneth C. Green, founding director of The Campus Computing Project. If successful, these posts will inform and entertain, and at times also annoy. A little dissonance can be a good thing.


Still Not Using Data to Inform Decisions and Policy

Has your institution increased its investments in analytics in recent years? How’s that working out for you? Two recent surveys suggest that for many institutions, the investment is not working out very well.

February 25, 2020
 
 

The past decade has seen increased campus discussions about and investments in analytics. Prompted in part by the activities in the consumer and corporate sectors, institutional leaders up and down the campus organization chart are increasingly engaged in conversations about leveraging data and analytic tools to inform institutional planning and policy making, enhance student learning, improve retention and graduation rates, and improve campus operations and services.

Fueling the campus conversations (and expectations) about analytics are the accelerating external pressures for improved institutional performance (e.g., student success and graduation), as well as the presentations (and promises!) of technology and analytics providers. The firms that provide administrative/ERP systems and LMS applications have increased their efforts to promote analytic resources to their campus clients. We’ve also seen the emergence of a new category of “analytic middleware providers” -- firms that promise programmatic solutions drawn from their “special analysis” of student and institutional data pulled from a variety of core but often incompatible data resources, resources that contribute to “data babel” at many institutions.

Admittedly, there are some impressive success stories from campuses that have invested in analytics. Perhaps the most striking involves Georgia State University, which has effectively eliminated the historical (and significant) differences in retention and graduation rates among its various student populations. As reported by The New York Times in 2018, the university “reshaped itself amid a moral awakening and a raft of data-driven experimentation.” Over a decade, the graduation rate for all GSU undergraduates increased an impressive 23 percentage points, rising to 55 percent in 2018. Particularly impressive is that the baccalaureate completion rate for African American men is now 55 percent, up from 18 percent in 2003.

Another example of analytics, this one focused on gateway courses, comes from the John N. Gardner Institute for Excellence in Undergraduate Education. As described in a recent Chronicle of Higher Education article, the institute works with client campuses to analyze data and implement evidence-based redesign initiatives for key gateway courses as part of campus efforts to reduce DFWI rates and improve retention among first-year students. Preliminary data from 13 campuses working with the institute “shows higher grades, pass rates, and retention rates among students in revamped courses compared with students in course sections that did not go through the reform process,” according to Gardner Institute president Drew Koch.

There are other impressive examples as well, cases where analytics has helped inform planning, programs and policies, leading to major gains in retention and graduation rates across student demographic groups.

And yet despite these emerging and well-documented success stories about using data and analytics to improve academic programs and services and to inform planning and decision making, two recent surveys point to analytic angst among CIOs and CAOs. By analytic angst, I mean that despite the rising institutional investment in analytics, only a small proportion of senior campus officials assess their campus investments in analytics as “very effective” or report that their institution does a “very effective” job of “using data to aid and inform campus decision making.”

Let’s begin with senior campus IT officers and data from the fall 2019 Campus Computing Survey. While three-fifths (60 percent) of the surveyed CIOs and senior campus IT officials identified data analytics as a “very important” institutional priority, less than a fourth (22 percent) assessed their institution’s IT investment in analytics as “very effective.” The proportion of IT officers rating their institution’s investment in analytics as “very effective” has hovered around 20 percent for the past five years. In contrast, in 2019 just under half of the survey participants (48 percent) rated the campus IT investment in on-campus teaching and instruction as “very effective,” and just over two-fifths (44 percent) offered a similar assessment about IT investments to support library resources and services.

Turning to provosts and chief academic officers, Inside Higher Ed’s January 2020 CAO survey also offers some striking data about the institutional inability to use data to aid and inform institutional decision making. Just a fourth (23 percent) of the surveyed CAOs assessed their institution as “very effective” in “using data to aid and inform campus decision making.” Moreover, the CAO number has plunged over the past eight years, from 31 percent in 2012 to 23 percent in 2020.

Despite rising institutional investments over the past decade, these data provide a candid -- and clearly disappointing -- assessment of the impact of analytics at many campuses.

The key question: Why the low numbers from CAOs and CIOs on the impact of analytics and on campus efforts to make effective use of data for planning and decision making?

The answer, I believe, is actually found in the examples from Georgia State, the Gardner Institute and some other institutions that have launched successful analytic initiatives. These efforts have been successful because they have looked beyond the numbers to develop and invest in programmatic efforts that transform data into information, insight and innovative initiatives.

Yes, ’tis true that the intense and well-designed analytic work at Georgia State identified some 800 separate factors that could affect academic progress and student success. But while the “top-level” summaries of the Georgia State story have often focused on analytics, essential to the university’s improved graduation numbers was the addition of some 180 academic support personnel to work with at-risk students. Students did not simply receive an automatic, analytics-driven (and potentially stigmatizing) email informing them that they might be at risk based on a range of metrics and behaviors; the email also provided a path to connect each student with academic advisers and critical support services.

Similarly, the Gardner Institute’s work draws on analytics to develop a “morning-after” strategy focused on course redesign: the emphasis is not on the numbers, but on using data to inform and improve gateway courses, leading to higher completion rates and lower DFWI numbers.

In other words, analytics only work if the analytic work is part of a larger and well-understood gestalt: the sum (improved student outcomes) is more than the parts (analytics and other unconnected efforts). Absent an institutional (or departmental) commitment to evidence-based, well-designed and well-resourced intervention strategies and support services, the investment in analytics is almost certain to fall short of expectations.

Too, part of the new conversation about analytics is that we must change the data culture in higher education. For too long data have been used as a weapon to document what was done wrong or why an initiative failed. Rather than using data as a weapon, we must (like Georgia State) use data as a resource, focusing on how we can -- and will -- do better.

And a key part of the new “data as a resource” strategy requires that we plan for assessment as we develop new programs and initiatives, not as a post-hoc afterthought. We must build in the plans for useful data and appropriate assessment as part of program design, rather than scrambling for data to support assessment after programs are launched.

Like many others, I have long argued that campus planning and policy discussions must migrate from opinion and epiphany to data and evidence. My hope is that the CAO and CIO data cited here suggest the bottom of the curve and not the continuing state of affairs.

We can -- and must -- do better.

Disclosure: I am an unpaid fellow of the John N. Gardner Institute.
