Analytics is hot. And no company is currently hotter than Civitas Learning.

The buzz around Civitas in the edtech space is largely attributed to the track record of co-founders Mark Milliron (Chief Learning Officer) and Charles Thornburgh (CEO). 

Mark and Charles are well known in edtech circles both for their entrepreneurial track records and for their deep connections to the education sector. (Charles was an in-house entrepreneur at Kaplan for many years, while Mark was the founding Chancellor of WGU Texas.)

I had the opportunity to chat with both Mark and Charles about the specifics of Civitas Learning, and where higher ed is going in general. What follows is an (edited) Q&A that the three of us conducted by e-mail.

Question 1. Is higher education broken? There seems to be a pretty good case to be made that any industry where costs rise at twice the rate of healthcare, while quality (however we measure it) does not seem to be rising at nearly the same pace, is fundamentally broken. Is there a counter-argument?
 
Broken? No. In need of innovation and evolution? Yes. Broken seems like a fairly strong statement and would suggest that most, if not all, higher education institutions are doing something wrong. We would argue that things are changing and there is some adapting that needs to be done across the industry to address both student success rates and costs.
 
What’s encouraging to us is the increasing focus of the institutional leaders we’re working with on achieving their diverse missions in sustainable ways. Indeed, many are trying to innovate toward radical improvement in student success with flat revenues at best. If the academy can be at its best -- diving deep into good data with critical thinking, creativity, and thoughtful action -- then these institutions will continue to deliver exceptional quality and see a higher percentage of students stay in school and graduate.
 
Question 2. How much does higher education change in the next 10 years? Will colleges go bankrupt? Will students even attend classes? Where will we see the biggest change in higher education? And where will we see the least amount of change?
 
First of all, that’s a LONG time horizon, especially given the pace of change we’re already seeing in the tools and techniques of higher education. Anyone who has the next decade fully scoped is pretty impressive.

What’s safe to say is that over the next ten years, many schools will have no choice but to adapt and evolve the way they think about student success and engagement – from their instructional models to their advising and support strategies to the way they connect students to careers and community.

Where we’ll likely see the biggest change (and we’ve started to see this already) is in the broadening of the term “classroom” beyond just four walls. Teaching and learning options will be more diverse--from traditional to competency-based to blended to fully online to combinations of all of these--and will increasingly leverage technology tools and digital curricular resources. Students will absolutely have more of a family of high-tech, high-touch options.

Of course, as students leave deeper digital footprints along their educational journeys, we think analytics will play an even greater role. It will become even more important to match students to optimal learning options, strategically target supports, and help students take more agency along the way as they use improved feedback and these more personalized learning and support systems to learn well and finish strong.
 
Question 3. It seems that "big data" has entered the hype cycle. Everyone is talking about analytics, and how we are going to use data to improve learning outcomes while lowering costs.  Are we bound to be disappointed in 5 years when we find that the fundamentals of higher education have not changed? That we see a continued rise in costs without substantial changes in 6-year graduation rates? Can you convince our skeptical readers that our “analytics moment” is real and not just the latest education fad?

In many ways, “big data” has entered the hype cycle -- and it’s important to separate the wheat from the chaff. I don’t think many people, in any industry, would argue that better-informed decisions are a bad thing. Indeed, far too many students, especially first-generation students, are flying blind on their higher education pathways. They don’t have the stories of family and friends to help scaffold them. Getting these students, their faculty, or their advisors solid insight drawn from the millions of stories told in the digital footprints of the students who came before is probably a good thing.

However, far too many data strategies will end up as little more than expensive edu-voyeurism, focusing high-cost data work on accreditation, trustee updates, or little-used reports that essentially watch as students succeed or fail. If that’s where the resources go, we doubt folks will see a return on the investment. But done in a thoughtful way, where we gain insights to guide action on the front lines of learning, this work can (and will) have tremendous positive impact -- and probably in fewer than five years.

Question 4. How would you actually characterize/define “big data” and analytics? Per previous question there’s plenty of discussion around the topics, but are we all speaking the same language when it comes to these topics?

Here’s how we see it: Insight analytics include a family of data science strategies that combine hindsight (e.g., data mining) and foresight (e.g., predictive modeling) to guide initiatives in teaching, learning, student support, and institutional management. The data we pull from various systems at a college or university tell stories of the diverse student journeys through institutions. One student’s story is interesting, but looking at these journeys in an ongoing systematic way involving thousands of students and millions of student records is transformative. Standing on this foundation, we can employ action analytics that use insight to fuel the right interventions at the right time with the right student along each individual’s learning journey. Put simply, our work with partner institutions has taught us that action fueled by insight is essential to helping students succeed.
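
To make the insight-to-action idea concrete, here is a minimal, purely illustrative Python sketch: it trains a toy predictive model on hypothetical historical records (the hindsight and foresight pieces) and flags current students for human outreach (the action piece). The feature names, data, model choice, and risk threshold are my assumptions for illustration, not Civitas Learning's actual platform or methods.

```python
# Illustrative sketch only: a toy "insight -> action" analytics loop.
# Features, data, model, and threshold are assumptions for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hindsight: historical student records (hypothetical LMS engagement score,
# credits attempted) paired with whether each student persisted.
X_hist = np.array([
    [0.9, 12], [0.2, 6], [0.7, 15], [0.1, 3], [0.8, 12], [0.3, 6],
])
y_hist = np.array([1, 0, 1, 0, 1, 0])  # 1 = persisted, 0 = did not

# Foresight: a simple predictive model trained on those digital footprints.
model = LogisticRegression().fit(X_hist, y_hist)

# Action: score current students and route likely-at-risk ones to timely,
# human outreach (advising, mentoring) rather than automated "rescue."
X_current = np.array([[0.25, 9], [0.85, 14]])
risk = 1 - model.predict_proba(X_current)[:, 1]
for student_id, r in zip(["S-101", "S-102"], risk):
    if r > 0.5:  # arbitrary threshold for this toy example
        print(f"{student_id}: predicted risk {r:.2f} -> route to advisor outreach")
```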

Question 5. When talking about data and analytics, there are multiple audiences to consider. How will these two concepts – in the best possible scenario – impact faculty? Administrators? Students?

It’s really about making the human moments precious for all involved. With useful data on the front lines, students can, in the worst-case scenario, realize their situation and assist with their own rescue. In the best cases, they can chart an informed and ambitious learning path for their future.

Faculty can use data on the front lines to determine which parts of the curriculum are best suited to each modality -- from online to group learning, from digital to live delivery -- based on an understanding of the how, why, and when of student engagement within each learning opportunity. Ultimately, it’s all about creating an empowered student body that works with their faculty and administrators to learn deeply and emerge as successful graduates.
 
Question 6. We often hear (and I’ve often said) that for higher ed to achieve productivity gains, we will need to change some of our standard operating procedures. What would that sort of change actually mean? What role should partnerships between not-for-profits and for-profit companies play? Where can we take costs out while improving outcomes?

As we’ve mentioned earlier, one obvious early win to reduce costs in the long run would be to make better-informed decisions and take better-informed action toward improving outcomes: increasing the number of graduates while decreasing the time to completion.

Retention has a huge ROI for universities and for students. I don’t know a university or college that isn’t trying to increase persistence; for many, though, it’s hard to know which initiative or intervention is really moving the needle on student retention, persistence, and success.

Thoughtfully designed predictive analytics platforms and apps can take the guesswork, and the associated costs of “bowling in the dark,” out of the equation so universities and colleges can allocate funds to the exact point and program that is going to move that student, at that institution, over the hurdles and loss points to completion.
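
As a purely illustrative example of what "knowing which intervention moves the needle" can look like at its simplest, the sketch below compares persistence rates for students who did and did not receive a hypothetical proactive-advising intervention. The data and the naive two-group comparison are assumptions for illustration; a real platform would use matched comparison groups and proper statistical testing.

```python
# Illustrative sketch only: a back-of-the-envelope way to ask whether a
# specific retention intervention is "moving the needle."
def persistence_rate(records):
    """records: list of (received_intervention: bool, persisted: bool)."""
    persisted = sum(1 for _, p in records if p)
    return persisted / len(records) if records else 0.0

# Hypothetical outcomes for students who did / did not get proactive advising.
records = [
    (True, True), (True, True), (True, False), (True, True),
    (False, True), (False, False), (False, False), (False, True),
]
treated = [r for r in records if r[0]]
comparison = [r for r in records if not r[0]]

lift = persistence_rate(treated) - persistence_rate(comparison)
print(f"Estimated persistence lift from intervention: {lift:+.1%}")
```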

These kinds of partnerships -- like those our community of colleges and universities is building with us -- allow institutions to maximize their available (and in some cases dwindling) resources, learn together what’s working, and provide students and faculty with the data and tools they need to make informed choices that lead to more successful outcomes.

Question 7. Technology, in and of itself, is not the answer. But we have access to some great tools to help educators and students be more successful in the classroom. So how do we inject the “human element” into the decision making process and the discussion so it’s not just about technology for technology’s sake? In other words, how would you convince an administrator or faculty member that ANY sort of technology investment is a good idea?

It’s always about putting technology in its place. It’s a tool that can be used by educators to help students learn well and complete their education journeys. Indeed, technology--analytics in particular--may be at its best when it helps us better understand “when” a student would be particularly receptive to personal outreach, or when personal outreach is crucial. It's not just about understanding patterns of performance, but also understanding the best "human" moments to intervene.

Moreover, technology tools and analytics may point to when a strong student is primed to be pushed by a personal conversation with a key influencer (near-peer mentor or maybe a mentor faculty member). Also, faculty leveraging analytics may learn from data flow (e.g., the engagement data) that some concepts in courses are best taught in groups, with peers, or through dialogic or Socratic methods—i.e., they are best delivered with some form of human interaction, not digital courseware or traditional lecture.

And finally, creating the “space and grace” for deeper conversations with students about their purpose and choices through advising is essential. In other words, let analytics, software and technology handle the pieces of advising that can be handled that way so it creates more time for student and advisor to connect on a deeper level. Learning together about how, when, and when not to use technology as part of the mix is vital -- it’s why we named our company Civitas Learning. Civitas means community or citizen; we want to be a community learning together in this work.
 
Question 8. The whole idea of metrics and measuring impact is going to be critical as we move forward implementing systems that leverage analytics in some form or fashion. How do we know we are being successful? How do we measure success?

There are multiple ways to measure success, but a big-picture example for us is what we call our Million More Mission. We want to narrow the gap between the number of graduates our country produces and the number it needs in order to compete, and to improve the prospects of striving students in higher education. The United States must produce roughly one million more graduates a year by 2025 to ensure the country has the skilled workers and entrepreneurial leaders it needs -- not to mention more deeply engaged and informed citizens.

Of the 30 million students in higher education today, about half will graduate, with huge disparities based on the income levels of incoming students. As daunting as the challenge of one million more graduates sounds, we can reach the goal together by improving outcomes for current students and finding creative new pathways for returning students. Analysts argue that reaching this goal would require increasing today’s annual output of associate and bachelor’s degree-holders by about 3.5 percent a year for the next decade. That’s achievable.
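
As a rough check on that 3.5 percent figure, the back-of-the-envelope calculation below compounds an assumed current output of about 2.5 million associate and bachelor's degrees per year (my assumption; the interview does not state a base number) over a decade.

```python
# Back-of-the-envelope check of the "3.5 percent a year" figure. The base
# output assumed here is a rough assumption, not a number from the interview.
base_output = 2_500_000      # assumed current degrees conferred per year
growth_rate = 0.035          # 3.5% annual increase
years = 10

final_output = base_output * (1 + growth_rate) ** years
print(f"Additional graduates per year after a decade: {final_output - base_output:,.0f}")
# Under this assumption, a decade of 3.5% growth yields roughly one million
# more graduates per year, consistent with the Million More Mission target.
```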

Individual institutions typically use more discrete measures related to their particular missions: (1) course completion, (2) year-to-year retention, (3) completion rates, and/or (4) job placement rates. Many are also attempting to get clearer on the achievement of learning competencies. An increasing number are also being pushed toward specific measures that are being built into funding models, which include many of the above plus targeted outcomes around specific programs, such as developmental education or STEM.

Regardless, the institutions that seem to be making the most progress are taking on the tough conversation about which metrics matter to them and match their mission, and are getting aggressive and smart about measuring progress against them.
