Competency-based learning

Competency-based education requires instructors to alter teaching approach

In competency-based formats, instructors adjust to interacting regularly with students, directing them toward clear learning outcomes and making other departures from their traditional practices.

Slow growth for competency-based education, but survey finds interest and optimism about it

While competency-based education is spreading gradually, interest and optimism about it remain high, and experts say careful growth is best.

Competency-based programs offer flexible learning in variety of models

Three online or hybrid CBE programs reflect the diversity of approaches to offering instruction on a flexible timetable with a focus on acquiring skills.

Trump administration rejects inspector general's critical audit findings on Western Governors

The Trump administration rejects findings from a 2017 inspector general audit that found the online giant WGU out of compliance and recommended that it repay $713 million in federal aid.

In learning styles debate, it's instructors vs. psychologists

A nearly century-old idea about learning remains “ubiquitous” despite scant scientific evidence to back it up, many experts say. But others still see value in the concept.

Essay on the game-changing potential of precision academics

Mapping of the human genome is exponentially increasing evidence-based innovation in health care. Through precision medicine, medical treatments are personalized and tailored to the individual characteristics of each patient. Mapping and coding the knowledge, skills, attitudes and behaviors that make up our “academic genome” -- and the emergent field of precision academics it will engender -- will be equally catalytic for postsecondary education.

This premise of precision is resonating and growing in postsecondary education. For example, instructional thinkers at the University of Michigan and National University advance the most explicit descriptions of precision academics -- using learning telemetry and analytics to tailor instruction to the characteristics, needs, paths and paces of individual learners.

Competencies are the essential, elemental statements of the knowledge, skills, attitudes and behaviors students are expected to learn and demonstrate throughout their postsecondary journey and by its culmination. Mapping its academic genome transforms an institution from reliance on the anecdotal, implicit objectives of individual course- and lesson-level teaching to the explicit: the collaborative, digitally linked, aligned and evidence-based learning objectives and outcomes of a digital academic enterprise.

The transformation of postsecondary education from its long and much-beloved manual, place-based, faculty-paced, master-crafted model of institutional teaching to one of networked, open, student-driven, digital, dynamic and individual learning is underway. Academia has spent years mapping our curricula, measuring college learning and drafting and reporting student learning outcomes. To paraphrase William Gibson, the academic genome is already being mapped; it’s just not evenly distributed -- or sufficiently connected -- yet.

The benefits of this mapping are numerous. They include assignments, assessments and outcomes explicitly aligned to course and program outcomes and to careers; more robust and informative documentation for quality assurance; more dynamic student records; and better ways to publish those records for different uses and stakeholders.

In their book Degrees That Matter, Natasha Jankowski and David Marshall describe a “learning systems paradigm” that builds on the work of hundreds of postsecondary institutions using the Degree Qualifications Profile, Tuning and the VALUE Rubrics of the Association of American Colleges & Universities to map their local academic genomes. This work makes possible the specification and coding of digital, machine-readable cyberobjectives and cybercompetencies: finely grained, unambiguous, actionable statements of instructional intent. The resulting continuous telemetry of real-time, digital data enables evidence-based analytics and adaptivity developments that are increasing the power and efficacy of learning systems.
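
To make the idea of a machine-readable competency concrete, here is a minimal sketch of what such a coded statement might contain. The field names, identifiers and example alignments are hypothetical, not drawn from any particular specification; standards such as IMS Global's CASE define far richer schemas for this purpose.

```python
# Hypothetical sketch of a machine-readable competency statement.
# Field names and identifiers are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Competency:
    identifier: str                  # stable URI, so other systems can link to it
    statement: str                   # the fine-grained, unambiguous statement of intent
    subject: str
    level: str
    aligned_to: List[str] = field(default_factory=list)  # program outcomes, standards, occupations

# A lesson-level objective linked upward to a (hypothetical) program outcome
# and outward to an O*NET occupational descriptor.
objective = Competency(
    identifier="https://example.edu/competencies/BIO-101-03",
    statement="Interpret a gel electrophoresis result to estimate DNA fragment sizes.",
    subject="Biology",
    level="introductory",
    aligned_to=[
        "https://example.edu/outcomes/BS-BIO-02",               # hypothetical program outcome
        "https://www.onetonline.org/link/summary/19-4021.00",   # O*NET: Biological Technicians
    ],
)
print(objective.identifier, "->", objective.aligned_to)
```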

Maybe you think that is great … maybe you don’t. But it is happening -- all around us.

Public K-12 systems in many states, including Georgia, are coding and connecting their learning standards together. The 50-State Digital K-12 Academic Standards Registry (announced by IMS Global Learning Consortium) enables coding, sharing and linking standards across every state in the registry. Richard Woods, Georgia’s school superintendent, said, “We see this not just as the adoption of a technical standard, but as a mechanism to help us realize one of our strategic initiatives -- the move toward personalized learning for every Georgia student.”

The Competency Model Clearinghouse, developed by the Employment and Training Administration in collaboration with other federal agencies and work-force development experts, documents the skills and competencies required in emerging and economically vital industries.

The Occupational Information Network (O*NET), developed under the sponsorship of the U.S. Department of Labor’s Employment and Training Administration, has digital descriptors for almost 1,000 occupations covering the entire U.S. economy. Every year, O*NET is used by millions to find the training and jobs they need, and by employers to find skilled workers.

The T3 Innovation Network, under development by the U.S. Chamber of Commerce Foundation, is building the talent development pipeline and marketplace of the future, using digital Web 3.0 technologies to better align learner, education and work-force data.

To connect with these digital talent development pipelines, postsecondary education must also become more digital. And there is progress. The Credential Connection is a collaboration of more than 120 postsecondary and work-force organizations. It is curating a highly diverse and fragmented credentialing system into one that evidences educational quality, increased access and better alignment with employers, education and certification/licensure agencies.

Digital Drives Analytics -- Analytics Drive Performance

The future of postsecondary instruction (and learning … and institutions) will be more and more digital. If you have not heard of Georgia State University and predictive analytics, you are missing the awakening of academia to the use of data that significantly informs and improves student performance, success and attainment. By using data on past performance to predict future problems, GSU has been able to develop, time and deliver effective intervention strategies that have increased degree awards by over 67 percent in six years. The influence on other University System of Georgia institutions is significant. No one shows up at a budget meeting without performance data now. If institutions can make this kind of improvement with static, legacy data, just imagine what they could do with real-time data telemetry from students -- as they are learning.
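
For readers who want to see the general shape of such analytics, here is a minimal sketch -- not GSU's actual system -- of using historical performance records to flag current students who may need an intervention. All data, features and thresholds below are invented for illustration.

```python
# A minimal sketch of predictive intervention analytics: fit a simple model on
# historical course outcomes, then flag current students whose estimated risk of
# non-completion is high enough to warrant an advising referral. Numbers invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical records: [prior GPA, credit hours attempted, first-exam score]
X_history = np.array([
    [3.6, 15, 88], [2.1, 12, 54], [3.0, 15, 75], [1.8,  9, 40],
    [3.8, 16, 92], [2.5, 12, 61], [2.9, 15, 70], [2.0, 12, 45],
])
y_history = np.array([0, 1, 0, 1, 0, 0, 1, 1])   # 1 = did not complete the course

model = LogisticRegression().fit(X_history, y_history)

# Current students, scored as their first-exam results arrive.
X_current = np.array([[2.3, 12, 50], [3.5, 15, 85]])
risk = model.predict_proba(X_current)[:, 1]

for student, p in zip(["Student A", "Student B"], risk):
    if p > 0.5:                      # arbitrary threshold for illustration
        print(f"{student}: estimated risk {p:.2f} -- refer to adviser")
    else:
        print(f"{student}: estimated risk {p:.2f} -- no action")
```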

The emergent field of learning analytics is bringing just this kind of evidence-based telemetry analysis and practice to lesson- and course-level design and deployment. Faculty are crossing the chasm from handcrafted to technology-enabled instruction. Want a glimpse of what that might look like? Watch this video. That’s what got (and keeps) me going down this road.

The appeal to a visual, playful learner of being able to explore and manipulate a digital curriculum, and thus digital courses, is fully captured in the 1:41 it takes Professor Euan Lindsay at Charles Sturt University in Australia to explain how their engineering curriculum can be personalized to the individual student. Imagine being able to take this to the lesson level, with learners and faculty as active co-conspirators in the game of learning. This is what the digital mapping and coding of a curriculum is all about.

Imagine further taking this granular data into a comprehensive learner record. The ability to learn once and render many ways will change expectations and drive the radical transformations George Mehaffy envisions in “Reimagining the First Year of College.”

What Do We Need to Do?

Watching this transformation in other sectors, we can clearly see that technology will drive change faster and with greater disruption than we can currently imagine. For generations the currency of academia has been the credit hour. The cybercurrency of the educational future will not be credit hours; it will be cybercompetencies. And there are several things we can do to prepare.

Continuing to map the academic genome is the first step. Mapping our current, handcrafted syllabi transforms them into actionable cyberobjectives and competencies and connects them to the digital ecosystem of learning, credentials and careers.

Build from local curriculum maps to national frameworks, using open, shared platforms to collect, aggregate and crosswalk frameworks and their components into resources that inform and power new instructional and institutional designs rather than maintain the existing limitations on them.

Take the next step by coding the academic genome. Mapping is not the endgame. By coding learning objectives, competencies and outcomes into fine-grained, unambiguous, machine-readable and actionable digital elements, we will open and operationalize new continuous-telemetry learning models. Good coding, like building good assignments and assessments -- like good teaching -- is hard but necessary and rewarding work.
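
As a small, hypothetical illustration of what "coding" a syllabus can buy, consider tagging each assessment with the identifiers of the competencies it evidences. Every identifier below is invented, but the audit at the end shows the kind of question that becomes machine-answerable once the alignment is explicit rather than implicit in prose.

```python
# Hypothetical sketch of a "coded" syllabus: each assessment is tagged with the
# identifiers of the competencies it evidences, making alignment machine-actionable.
syllabus = {
    "HIST-210 Essay 1":  ["comp:source-analysis", "comp:written-argument"],
    "HIST-210 Midterm":  ["comp:chronological-reasoning"],
    "HIST-210 Essay 2":  [],   # not yet coded
}

# Crosswalk from course-level competencies to program outcomes (all invented).
program_outcomes = {
    "comp:source-analysis":         "outcome:historical-inquiry",
    "comp:written-argument":        "outcome:communication",
    "comp:chronological-reasoning": "outcome:historical-inquiry",
}

# Simple audit: which assessments still lack coded competencies, and which
# program outcomes each assessment ultimately evidences.
for assessment, comps in syllabus.items():
    if not comps:
        print(f"{assessment}: no coded competencies yet")
    else:
        outcomes = sorted({program_outcomes[c] for c in comps})
        print(f"{assessment}: evidences {', '.join(outcomes)}")
```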

Establish an open-source competency ecosystem. We need competencies to be open, shareable and connectable within and among both third-party and open-source platforms. Everyone will benefit from resources connected under common technical constructs. This "edusystem" of competencies will be a resource for faculty building courses, aligning learning to industry competencies and communicating student proficiencies to transcripts and employers.

Embrace the next paradigm: precision academics. A digital academic genome doesn’t just change the possibilities -- it changes the game. All of the preceding work prepares us to move from episodic dives into programmatic design and redesign to the truly transformative leap into an evidence-based, ongoing engagement in a dynamic curriculum that is continually informed and refreshed.

Digital will drive innovation in postsecondary instruction for the foreseeable future. Technology is fusing with the expertise of faculty in ways that engage, enable and extend the best abilities of each. How far that transformation will go remains to be seen.

Myk Garn is assistant vice chancellor for new learning models at the University System of Georgia.


Credential Engine seeks to map the credential landscape

The Credential Registry is several months into its mission to document all U.S. credentials, but the finish line is further away than ever.

Risks of the Trump administration's next push to deregulate higher education (opinion)

Here they go again. After halting and gutting two major rules that were put in place to safeguard students and taxpayers from predatory and abusive practices, Education Secretary Betsy DeVos is apparently planning yet another round of deregulation that will dismantle key protections against fraud, waste and abuse, under the guise of flexibility to promote innovation.

Unlike the last Republican administration, this one seems to lack any proactive agenda to make higher education more affordable and improve student outcomes. Instead, DeVos’s only agenda so far is to undo critical policies through an ongoing assault on meaningful reforms. This latest round of reckless deregulation seeks to unravel several rules that reflect years of input, negotiation and thoughtful deliberations. Enacted largely at the recommendation of the department’s independent watchdogs in the inspector general’s office, several of these safeguards survived repeated legislative repeal efforts. Then came DeVos.

Let’s start with the credit hour. As the government helps millions of students pay their tuition bills with over $150 billion in funding each year, it needs a unit of measurement to know what it is paying for: education. While imperfect, the credit hour has served for over a century as the only consistent metric of academic workload, for students, faculty, institutions and the government. Simply put, both eligibility for and the amount of federal aid awarded are based on the number of credit hours enrolled.

Without a common metric, though, the system breaks down. After finding that colleges inflated and overawarded credits to rake in additional aid, and that accreditors failed to hold the schools to any standard for how many hours of instruction equaled a credit hour, the inspector general asked Congress and the department to create a standard definition and clarify how accreditors should review credit-hour policies. Defining the credit hour -- the cornerstone of federal student aid -- would limit the most egregious cases of credit inflation, where students and taxpayers wind up paying more for less education.

The department issued a rule in 2010 that established a flexible yet consistent definition -- for student aid purposes only -- that could accommodate everything from a traditional classroom lecture to a self-paced competency-based course. Rather than measuring solely the “seat time” students were expected to spend in the classroom, it allowed institutions to add up the amount of work required to complete the material.
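
As a rough illustration only -- the regulation permits equivalent measures rather than mandating a single formula -- the commonly cited baseline of one hour of instruction plus a minimum of two hours of outside work per credit per week implies that a three-credit, 15-week course represents roughly 3 × (1 + 2) × 15 = 135 hours of expected student work.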

Still, as with almost any regulation, the credit-hour rule was criticized by the industry and its congressional allies for creating more compliance burdens for colleges and accreditors and stifling innovation. Critics’ most dire predictions have since been soundly disproven by the explosive growth of competency-based education programs. And while the amount of instruction and student work may not be perfect proxies for education, there is no viable alternative to replace the credit-hour definition -- no other widely accepted measure of learning -- that can be readily applied throughout higher education.

As if repealing the definition of a credit hour would not be bad enough, its effects would be even more significant when paired with another change DeVos reportedly plans to make: redefining the requirement for regular and substantive interaction between online students and their instructors.

When Congress decided to give online programs equal access to student aid in 2006, it wisely included a common-sense requirement that they provide “regular and substantive” interaction between students and instructors. Designed to avoid subpar education and to differentiate online from correspondence education -- in which students receive course materials and learn on their own, typically without ever communicating with a qualified instructor, and for which federal student aid is more restricted -- this means that for a college to charge the taxpayers full price, the education needs to include regular contact with subject-matter experts, about the subject matter.

A recent inspector general audit of Western Governors University has brought much attention to this issue, but not all colleges that stand to benefit have WGU’s affordable price and good outcomes. In fact, many have the exact opposite: high prices and poor outcomes. The limitations on correspondence education grew out of the gold rush of low-quality, for-profit colleges that engaged in fraud and abuse in the 1980s and 1990s. And the for-profit sector’s heavy online presence -- 60 percent of its students are enrolled entirely online -- means that it would likely benefit significantly from a change to the rules, relaxing the requirement that faculty engage in their students’ education and allowing those schools to dramatically reduce one of their biggest costs -- instructor salaries -- and pocket the difference. Most importantly, taking teaching out of the equation raises fundamental questions about what higher education means, and what taxpayers are really paying for.

Amazingly, these are not the only rules slated for elimination. DeVos also plans to reconsider -- and, one can reasonably expect, weaken -- rules governing two legs of the so-called higher education triad. Together, accrediting agencies, states and the department are meant to ensure the integrity of student aid programs. Since it was signed into law in 1965, the Higher Education Act has required all colleges that want to participate in federal aid programs to be authorized as postsecondary institutions by the states in which they operate. In other words, just like for any other business, colleges need a state license to open up shop. Despite this long-standing requirement, some states had not even set up a process to do so.

To address this discrepancy, a 2010 rule required all institutions to have state approval and all states to have a student complaint system. It was implemented several years later. Again, critics cried foul and predicted that colleges would be shut down, but repeal efforts failed and, to our knowledge, not a single institution has closed as a result of the department’s enforcement of the rule. Then in 2016 the department closed a loophole by applying the same requirement to online colleges, which is scheduled to go into effect this summer. DeVos evidently wants to dismantle both, despite no evidence of negative or unintended consequences, and also to revise the already-weak regulations governing the requirements accrediting agencies must meet.

This deregulatory agenda constitutes no less than an attack on the very heart of higher education. The latest regulations on the chopping block are designed to ensure a baseline of institutional engagement in the quality of higher education, to preserve the integrity of federal student aid and to protect the students and taxpayers whose voices are too often drowned out by those of industry. There’s just too much at stake for us to accept the rhetoric of flexibility for innovation and ignore the far-reaching implications.

Spiros Protopsaltis is associate professor of education policy at George Mason University’s College of Education and Human Development and served as deputy assistant secretary for higher education and student financial aid at the U.S. Department of Education during the Obama administration. Clare McCann is deputy director for federal policy at New America’s higher education initiative and served as a senior policy adviser at the U.S. Department of Education during the Obama administration.


Skills disconnected from academic programs shouldn't matter to colleges (opinion)

Skills do not matter.

Let me say that again. On their own, skills do not matter.

This is worth saying in response to Thursday’s Inside Higher Ed story stating that the American Council on Education will “team up with the digital credential provider Credly to help people put a value on skills they have learned outside college courses.” The initiative, funded by the Lumina Foundation, is, in the words of ACE’s Ted Mitchell, “about creating a new language for the labor market” in which skills-based competencies are valued and credited.

It’s wonderful and important for employers to develop their employees’ skills, but colleges and universities need not take notice, because these efforts are irrelevant to collegiate education’s goals and purposes.

One way to think about why skills do not matter is by analogizing to other kinds of education. Imagine your employer provided you with a manual dexterity class where you learned to move your fingers about effectively. Now imagine that you came to a guitar teacher and asked for credit. Certainly, guitar players need manual dexterity, but the guitar teacher would wonder why you deserved credit. Learning dexterity without actually playing the guitar is not particularly valuable. It certainly does not mean that one can play the guitar, nor that one has understood the guitar or embraced the purpose of studying it. It’s a meaningless skill from the perspective of a guitar teacher.

The same can be said of a karate teacher. Imagine that your employer had taught you to kick but had never introduced you to the specifics of karate. Do you have a “karate competency” because karate also requires kicking? Of course not.

Instead, a good karate instructor will point out that kicking abstracted from the context of learning karate is not particularly relevant to the task at hand. It will not teach one how to kick within karate, nor will it instill the values and discipline that a karate instructor intends to develop in her or his students.

The same is true for college professors committed to ensuring that students graduate with a liberal education. Certainly, being successful in the arts and sciences requires high-level cognitive and academic skills. But those skills are meaningless unless they are learned within and devoted to the purposes of liberal education.

In short, offering college credit for disembodied skills is as much a mistake as a guitar instructor offering credit for manual dexterity.

How, then, should colleges and universities understand skills? For starters, they should always see them in relation to the specific ends of the programs that they offer. This is as true for vocational as for liberal education. The skills of a carpenter or a nurse or a car mechanic are not isolated but are interconnected and oriented to the end of wood construction, providing health care or repairing engines, just as a guitar teacher’s goal is to impart knowledge and techniques in relation to playing the guitar.

For four-year colleges and universities, on the other hand, the skills that matter should be related to their primary mission of offering every undergraduate a liberal education. At such institutions, academic skills should be developed in the context of, for example, reading and writing about literature or history or engaging in scientific inquiry.

A liberal education is not just any kind of education. Like carpentry, nursing or guitar playing, it has content. It seeks to cultivate specific virtues through specific practices. For example, the goal of a historian is not to teach abstract skills (such as parsing evidence or writing papers) but to help students engage in intellectual inquiry about the past. This means that skills are developed within the context of reading and writing history. The end is historical perspective, and the skills are means to that end. From the perspective of a historian, it matters little whether someone has good skills unless they also have learned to value history and to develop historical insight.

In addition, skills, from the perspective of four-year colleges and universities, are meaningless outside studying specific subject matter. If colleges and universities want students to care about and think with the arts and sciences, students need to spend their time studying the arts and sciences.

Indeed, scholars of teaching and learning have made clear that critical thinking skills cannot be abstracted from the material that one studies. As James Lang writes in his book Small Teaching: Everyday Lessons From the Science of Learning (2016), “Knowledge is foundational: we won’t have the structures in place to do deep thinking if we haven’t spent time mastering a body of knowledge related to that thinking.” That is because the ability to ask sophisticated questions and to evaluate potential answers is premised on what one already knows, not just on skills abstracted from context.

Thus, if the goal of four-year college education is liberal education, we need students to study subject matter in the humanities, social sciences and natural sciences. Students need to engage seriously with history and politics, or economics and physics, before they will be able to think critically about history or politics or economics or physics. This takes time. Assessing skills cannot, and certainly should not, be done outside the context of the subjects one ought to study in college.

This is not to deny that employers should invest more resources in developing their employees’ skills, nor to suggest that those skills don’t matter within the context of specific employment markets. There are many reasons to celebrate public and private efforts to develop Americans’ work-force skills, and doing so can benefit both employers and individuals.

It simply has little bearing on the kinds of things for which one should earn college credit. Employers’ goals are not to graduate liberally educated adults, but to generate human capital. Generating human capital may also be a byproduct of a good liberal education, but it is certainly not the goal of it.

In fact, a good liberal education asks students to put aside, even if just for a while, their pecuniary goals in order to experience the public and personal value of gaining insight into the world by studying the arts and sciences. This is the end, the purpose, the reason for a college education. Whatever other purposes students might bring to their education, and whatever valuable byproducts emerge as a result of their time in college, colleges and universities should remain true to their academic mission.

Johann N. Neem is a professor of history at Western Washington University. He is the author of Democracy’s Schools: The Rise of Public Education in America (2017). The ideas in this essay draw from "What Is College For?" in Colleges at the Crossroads: Taking Sides on Contested Issues (2018).


Lifting the fog on the data desert surrounding the microcredentials universe (opinion)

We hear lots of chatter from the cheerleaders of disruption about the “alternative” or microcredential provider universe. They avow that this universe is Trumpian “huge” -- and growing -- and that the alternative sector has the potential to greatly improve outcomes across the board in postsecondary education.

But most higher education commentators and analysts would agree that we know very little about participation rates, completion rates and demographics of students in the microcredential provider universe. My critique of the paucity of data in this alternative universe goes beyond the emptiness of the promises about the sector to reflect the genuine concern of public policy for greater equity in participation and outcomes in postsecondary education.

I have spent some time on the websites of 72 providers and microcredential facilitators, reading the “about” and “programs” sections of those sites, following the links, contemplating the accompanying blogs, and digging out the teaspoons of data one finds under rocks and behind fences.

Greater equity, as we normally understand it, means closing gaps in participation and completion by race/ethnicity, but you will not read those words on the websites of the cheerleaders or in reports on this territory by august bodies. It’s an issue studiously avoided. Gender sometimes; age sometimes. Race/ethnicity, no. The data are not there, and the minor attempts to provide them are pathetic.

The silence is deafening. If one found low participation rates for minority students, that would undercut any claims the “alternative” sector might advance for its contribution to equity in postsecondary education. The federal Integrated Postsecondary Education Data System (IPEDS) does not avoid these data for any students in its 7,000-plus Title IV institutions of postsecondary education, nor does the recently released NCES Household Education Survey (2017). By law, they cannot duck, and don’t.

One must acknowledge the counter to this critique up front: it is not in the mission of the alternative universe of microcredential providers to address any demographic disparities among the populations who pursue education of various kinds after high school. Title IV institutions may have an equity mission; those outside that universe do not, and should not be held to our standards, even in the simple matter of the data they keep.

My stance is otherwise: we are in this together, and if for-profit Title IV institutions keep data, so should the mostly for-profit institutions outside Title IV. All institutions, organizations and enterprises offering education and training to U.S. residents are serving a common population. None can claim exemption from telling the public whom they serve and for what, even if an equity mantra is not in their public mission or “about” statements.

What Stands Out From This Balloon? And What Should We Say About It?

Standing out, principally due to their visibility in the blogs of disruption and the cheerleading columns, are accounts of “skills-based short courses,” MOOCs and other nanocredentials. This ephemeral universe brags more on its websites about job placements, corporate destinations and earnings of “graduates” than about the ratio of completers to beginners -- i.e., more about what than who.

Do they provide credentials? You sure can get badges from some, and MOOCs will give you some kind of electronic confirmation of completion that you can call a credential (if you pay for it), or, if you complete more than three or four, a nanodegree. But again, these are data deserts.

How do we assess student volume and characteristics in these providers? Where we have reports of enrollments on their websites, they are spun in fantastic round number estimates: 160,000 (www.ucertify.com), 215,000 (www.canvas.net), four million (www.edX.com), “15+” million (www.Lynda.com). As for credential awards, we find much lower, but still mostly rounded estimates: 500 to date (www.fullstack.com); 1,300 to date (www.startupinstitute.com); 1,700 (www.appacademy.com).

In that universe of 72 providers of such credentials or completions that I followed on their websites, only 10 offered any information on enrollments (including, for our amusement, “hundreds”), and only 12 on completions (principally cumulative “alumni” or a sample of head shots of alumni). Demographics? That’s something just about no one knows because very few of these organizations provide data on anyone to anyone else.

Doubt it? Go online to the websites of App Academy, BadgeOS, Badgr, Bloc, Coding Dojo, Coding House, Degreed, Epicodus, Flatiron School, Fullstack Academy and on and on and see if you can find any data on enrollments or completions and the demography of those who enroll or complete. You can’t.

How Course Report comes up with statements that boot camps will graduate 22,949 in 2017, when only one of the boot camp badge providers, Startup Institute, reports any numbers on enrollments or graduates, must remain an eternal mystery. General Assembly claims 35,000 “alumni,” but there is a genuine question of what “alumni” means, as it can include folks who register Monday and are gone by the following Wednesday. If we want to see what the recent American Academy of Arts and Sciences report authors call “rigorous research” on this playing field ("The Complex Universe of Alternative Postsecondary Credentials and Pathways"), we have to start with such digging. Did I get it all? No, but it’s not beyond reach.

There is another, and more generic, problem with the data: when you find them in footnoted references, they are estimates based on tiny samples from the putatively authoritative Class Central and Course Report. How tiny? Six hundred and sixty-five in 2015; 1,143 in 2016 -- all self-selected. These are not what statisticians call “true populations.”

By contrast, the recently released 2016 National Household Education survey from NCES started with a weighted sample of 47,000, and the national longitudinal studies run by NCES also start with roughly 20,000 to 25,000 true population students drawn precisely in order to be weighted to convincing national portraits. Beginning Postsecondary Students, Baccalaureate and Beyond, the Education Longitudinal Study of 2002 -- these are not fake news. The footnoted worlds are, and we’ve got to fix that if we are fully to understand what people engaged in learning do after high school.

Then There Are the MOOCs

The MOOC universe is colossal, or so we are told, and nearly always by estimates. So when the providers themselves are the direct source, we are never sure whether numbers such as “15+ million” are cumulative, annual or hallucinatory. Yet from estimates in the Harvard Business Review, we can reasonably assume that only a third of MOOC enrollees are domestic.

If so, should not one ask, and in a pointed way, whether full data accounting by MOOCs, and IT certifications in particular, should include populations outside the United States? Coursera offerings come in eight languages, and partner universities are in at least a dozen countries, so whatever one sees of enrollments and completions (next to nothing) is hardly all domestic. Again, on Udemy’s website we find an estimate that two-thirds of its self-estimated 17 million students to date are outside the U.S.

Failure to deal with country of origin issues leaves us with no way of knowing how this factor clouds interpretation. Offering a geographic distribution in terms of courses offered, not students, as Class Central does, is not very illuminating, either, not if we want to know whether the MOOC and microcredential universe is shrinking gaps in U.S. higher education participation and completion.

For 2014, Canvas.net reported 214,997 enrollments (with age and gender breakouts based on a 20 percent response rate), but we have no idea how much of this was domestic. Not confronting this issue is the most serious data oversight of the American Academy of Arts and Sciences’ recent report on this territory.

As was the case in the universe of boot camps, the only quasi-demographic that turns up consistently in the MOOC universe is the proportion of enrollees or credential recipients who had previously earned bachelor’s degrees. Not surprisingly, when that number is either offered or estimated, always in percentages, it’s very high: 64 percent (Udacity.com), 73 percent (www.edX.org), 78 percent (www.turingschool.com), 83 percent (www.startupinstitute.com). What that obviously means is that the alternative credential universe of boot camps and MOOCs (along with IT certifications) is not contributing as much to the national undergraduate completion agenda as some of its cheerleaders would have us believe.

In fact, it can be argued that since such a huge proportion of participants already hold degrees, these “pathways” are not “alternatives” to degrees, either. In turn, that is what the missing race/ethnicity data would likely show -- much to the sector’s chagrin. Let’s be simple about the impact: Can you sell something to minority students from which they are largely excluded -- or in which they barely appear -- to begin with? And can you sell that same something to minority students if you cannot demonstrate that it is working for them? That means real, hard data.

How Do We Get Better Data?

Everyone agrees that we’re missing a lot of information from the “alternative credential” universe, a universe that nonetheless claims millions of students. Presumably, we would want to account for these individuals in our national data portraits, if for no other reason than to document the full extent of postsecondary participation and completion, particularly for standard demographic groups that “traditional” higher education accounts for through IPEDS.

What do we do about it? First, we are not going to sweep these providers into the IPEDS system, because they are not Title IV institutions. No federal data collection can touch them. Nor will anyone get what the AAAS report recommends for longitudinal studies, which require student-level data that no organizations other than the National Center for Education Statistics and the U.S. Department of Labor can produce, e.g., for grade-cohort studies such as ELS-02 or event cohort studies such as Beginning Postsecondary Students, and which would come with a price tag north of $80 million. There are a lot better ways to spend that kind of public money.

The NSC Solution

So, I propose using the nonfederal National Student Clearinghouse instead, and NSC is open to the idea. But -- and it’s a big but -- one cannot compel these organizations, ranging from small boot camps to the huge MOOC providers to corporations and associations granting certifications, to report anything (let alone the kind of data outlined below) to anyone. So what do we do about it?

First, we bring together a coalition of the major higher education organizations, chaired by the American Council on Education. They, in turn, write a joint public email/letter to the cognizant authorities of every noncollegiate boot camp, MOOC provider, industry certifying authority and mixed course provider, inviting them to submit annual enrollment and completion data to the National Student Clearinghouse and spelling out the explicit benefits of doing so.

The principal benefit for microcredential providers, as NSC’s Doug Shapiro points out, lies in participating in a nationally standardized system for student tracking, hence gaining a mantle of credibility and recognition, along with a trusted partner. For higher education, acting through its representative national organizations, a new sector of microcredential providers would be recognized instead of hidden, feared, fantasized about or berated, and its contributions to national policy objectives made explicit. This is, after all, a large set of providers of postsecondary education that lies outside accreditation, Title IV and federal data systems. Everyone wants to see a full picture, and you are part of it, dear microcredential providers. NSC is the best route out of the shadows.

Shapiro also cites the benefit to microcredential providers of linking their awards to other labor-market indicators in an established system -- instead, I would add, of relying on guesswork, self-selected “alumni” samples or counting “hello/goodbye” people as “alumni.” That, too, raises the question of the range of information these providers could add to national accounting -- a range that a letter from the higher education organizations might suggest.

I’d keep the data collection boundaries simple at the outset until the organizations that join get their sea legs, so to speak: domestic enrollments and completions by gender, race/ethnicity, age and prior level of education. If that means the alternative providers have to engage in some institutional research and hire the folks to do it, that’s what it takes to join the universe, receive due recognition, be included in national reporting and see one’s records open to students who otherwise would wonder where their credentials sit in the universe. The opening gambit of the higher education organizations should not frighten the microcredential providers away by asking for more.
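
To make those boundaries concrete, here is a hypothetical sketch of the minimal annual record a provider might submit, limited to the fields proposed above. The field names, category labels and numbers are invented; they do not reflect the National Student Clearinghouse's actual submission formats.

```python
# Hypothetical sketch of the minimal annual record a provider might report:
# domestic enrollments and completions, broken out by gender, race/ethnicity,
# age and prior level of education. All names and numbers are illustrative.
from dataclasses import dataclass

@dataclass
class ProviderAnnualReport:
    provider: str
    year: int
    domestic_enrollments: int
    domestic_completions: int
    # Each breakdown maps a category label to a count and should sum to the totals above.
    by_gender: dict
    by_race_ethnicity: dict
    by_age_band: dict
    by_prior_education: dict

report = ProviderAnnualReport(
    provider="Example Coding Bootcamp",   # hypothetical provider
    year=2018,
    domestic_enrollments=1200,
    domestic_completions=860,
    by_gender={"women": 480, "men": 700, "another/unreported": 20},
    by_race_ethnicity={"Black": 180, "Hispanic": 210, "White": 620, "Asian": 150, "other/unreported": 40},
    by_age_band={"under 25": 520, "25-34": 540, "35+": 140},
    by_prior_education={"no degree": 300, "associate": 160, "bachelor's or higher": 740},
)

# A completion rate is then a simple derived figure rather than a marketing claim.
print(f"{report.provider}: completion rate {report.domestic_completions / report.domestic_enrollments:.0%}")
```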

Beyond NSC: A Second Critical Piece of Data Collection

A second data territory the American Academy of Arts and Sciences report unintentionally raises is that of credit connections between “alternative” providers and institutions of higher education. In the trade press and blogs we get anecdotes and a host of undocumented claims about such activities, but nowhere can anyone find a comprehensive account of who among postsecondary institutions provides credit, from what and for what.

To track the interactions between alternative sources and institutions of higher education requires aggregate numbers, by higher ed sector and “alternative” source, to yield statements such as: 46 community colleges gave additive credit for 291 apprenticeships; 173 community colleges gave additive credit for coding badges; 280 four-year colleges accepted completed MOOC courses for credit; and so on. Furthermore, as this is ultimately a student-level question, each credit-granting statement would be accompanied by the requisite demographics: gender, race/ethnicity, age, highest prior level of education.

This is not the territory of the National Student Clearinghouse; it would instead require a parallel survey, however specialized, of all accredited institutions of higher education that grant credit or credit equivalents. As in the NSC case, though, there is no nonfederal authority that can compel institutions to report anything. But there is no other way of documenting and detailing the formal connections between the “alternative” and traditional higher education sectors, and of getting a detailed handle on what we are and are not doing for minority students in this sphere.

Thus I take a leap of faith and call for a major higher education organization to step forward, gather others of similar weight and petition NCES for IPEDS to add some very simple data questions to its 2020 survey, questions on the order of “How many of your students were granted credit during the academic year under consideration for course work or course work-equivalent completion from each of the following sources: apprenticeships, industry certifications, boot camps, MOOCs.”

Do it once, in 2020, to see whether we get anything truly worth noting, anything beyond mythology. Do it once to see how well these credit grantings are distributed -- or not. And if institutions know now that these data will be requested in 2020, that gives them enough time to prepare.

I admit this is a hope, not an agreement. All we can do is advocate. Let that advocacy start here.

Clifford Adelman is author of A Parallel Postsecondary Universe: The Certification System in Information Technology (2000), The Toolbox Revisited: Paths to Degree Completion From High School Through College (2006) and The Bologna Process for U.S. Eyes: Re-learning Higher Education in the Age of Convergence (2009), and is a co-author of "The Degree Qualifications Profile" (2014).

