Inside Higher Ed recently checked up on adoption of badges specifically, and alternative credentialing generally, with a look at early adopter Illinois State University’s rollout of a badge platform. The overarching goal of badging and alternative credentialing initiatives is very valuable: to better communicate the value and variety of people’s skills to employers, making it easier to connect candidates with jobs and improve employment outcomes. Yet the focus on badges and alternative credentials is like trying to facilitate global trade by inventing Esperanto.
The conception, theory and adoption of badge-based alternative credentialing initiatives starts as far back as 2011, when Mozilla announced the launch of its Open Badge Initiative and HASTAC simultaneously made “Digital Badges for Lifelong Learning” the theme of its fourth Digital Media & Learning competition. In the five years since, much has been written and even more time spent developing the theory and practice of alternative credentialing via badges -- from Mozilla and its support by the MacArthur Foundation to Purdue University’s Passport, to BadgeOS and Badge Alliance. Lately, the Lumina Foundation has taken the lead promoting alternative credentialing, most recently participating in a $2.5 million investment in badge platform Credly and a $1.3 million initiative to help university registrars develop a “new transcript.”
The premise behind all of the badge and alternative credential projects is the same: that if only there were a new, unified way to quantify, describe and give evidence of student learning inside the classroom and out, employers would be able to appropriately value those skills and illuminate a path to job outcomes. These kinds of premises often lead to utopian, idealized solutions that imagine transforming society itself. From Lumina’s “Strategy 8” overview:
To maximize our collective potential as a society, we need a revamped system of postsecondary credentials -- a fully integrated system that is learning based, student centered, universally understood and specifically designed to ensure quality at every level.
The problem for Lumina, Mozilla, Credly and the rest is that they’re proposing to replace a rich variety of credential “languages” with a universal one that’s not just unnecessary, but that’s modeled on fundamentally flawed analogies and observations.
I’ll start with the flaws of badges as a credentialing solution. Early on, digital badges often used Boy and Girl Scout badges as an analogy, but the more direct precursor of the current generation of badge solutions is video games. Indeed, attaining badges for completing certain tasks or reaching certain milestones is such a core feature of video game design and experience that the whole practice of rewarding behavior within software is referred to as “gamification.” This approach became widespread (with the launch of Foursquare, Gowalla, GetGlue and dozens more) in the years just preceding the launch of digital badges.
Yet video game badges -- and the badges employed by gamification companies -- are not truly credentials, but behaviorist reward systems designed to keep people on task. As credentials, their only useful meaning was within the systems in which they were earned, specifically within a given video game or bar-hopping app. Scout badges have a similar limitation: whatever their value in motivating attainment toward a worthy skill or outcome, the meaning of those badges is difficult to assess for nonscouts, or those not trained in the visual language of scouting badges.
Badge adherents aim to address the “value” and portability of badges by attaching proof of skills to the badges themselves. This is the same idea behind e-portfolios: that evidence of each skill is not just demonstrable, verifiable and universally understood, but useful to employers. Yet outside of specific fields, portfolios simply don’t matter to employers. As Anthony Carnevale, director of Georgetown University’s Center on Education and the Workforce, told The Chronicle of Higher Education earlier this year about the New Transcript portfolio, “Employers don’t want to take time to go through your portfolio -- they just don’t.” Where evidence of skills is important and useful, solutions already exist: GitHub for software developers; Behance for designers; transcripts, essays and recommendations for graduate school.
The idea of replacing university “dialects” with a new language of skills and outcomes is less metaphorical when think tanks and ed-tech companies talk about alternative credentials as a category. There, advocates propose an entirely new vocabulary: microcredentials, nanodegrees, stackable badges and more, all meant to convey (to employers primarily) the body of skills and knowledge that a student possesses. But they are redefining concepts that already exist, and that exist productively for the marketplace of students, educators and employers.
Consider the stackable badge, the idea that learning competencies should be assessed and verified in a progression that comprises and leads to a certified credential. But stackable credentials already exist in ways that everyone understands. In the undergraduate major, a student completes a series of related and escalating levels of mastery in a given subject area, assessed by experts in that field. Upon completion of those microcredentials -- i.e., classes -- the student is awarded a degree with a focus in that field and with an indication of attainment (honors). The same goes for hundreds of areas of expertise inside and outside higher education: in financial analysis (the extremely demanding and desirable CFA designation), entry-level and advanced manufacturing (the National Association of Manufacturers MSCS system), specific IT areas of focus like ISACA and (ISC)2, bar exams, medical boards, and more.
Credentials, in and of themselves, are a solved problem. I know this because my own company, Merit, launched the biggest, most comprehensive badge experiment that no one has heard of. Between 2011 and 2014 we tested a variation of the scout model -- a badge-based visual language of college milestones and credentials analogous to a military officer’s dress uniform -- that could be quickly read to convey a person’s skills, accomplishments and level of achievement. Nearly 500 colleges granted more than three million students almost 10 million badges that included academic honors, notable cocurriculars, experiential learning, internships and more. We tested interest by employers, educators and students (and continue to). What’s clear is this: it’s far, far more important to simply document existing credentials than to invent new ones, or a new language to describe them. Stakeholders in the high-school-to-college-to-career pipeline understand and value credentials as they exist now, and rarely need or want a new way to understand them. They just want to see them.
Connecting students’ skills and ambitions to the pathways to a career is a big deal, but it doesn’t require a new language that’s based on techno-solutionist fantasies. LinkedIn, the “economic graph” that many hold up as a model, needed more than $100 million of private capital for something as simple as convincing managers and a certain professional class to keep updated résumés online. Doing something similar for every single student is both more valuable and more difficult -- and doesn’t need to reinvent the entire language of credentials to complicate the effort.
My biggest frustration with badges and alternative credentials isn’t that they are an ivory tower solution to a real world problem. It’s that helping students succeed means more than figuring out a new language. Higher education is a demanding, high-stakes endeavor for the vast majority of students. Proposing that they -- and the institutions educating them and the employers who might hire them -- learn a new lingua franca for conveying the value of that learning, every year, over the very short time that they’re mastering the skills and knowledge that they need isn’t just impractical. It’s unfair.
Colin Mathews is founder and president of Merit, a technology company focused on creating and sharing stories about students’ successes.
Coursera’s recent pivot, following the departure of its founders, from saving the world to providing corporate training might tempt us to indulge in MOOC schadenfreude. That would be unfortunate. After all, MOOCs weren’t invented by Silicon Valley start-ups. They were invented by teaching faculty and co-opted by vendors.
Most of the time, ed-tech tools and services are rooted in innovations in teaching practices that were conceived, tested and refined by educators. But in the process of promoting technology, the pedagogy often gets lost. What we are left with is some product for sale attached to some (usually inflated) claim of benefit without the connective tissue of the teaching idea that makes it all work.
Companies like Coursera and Udacity co-opted MOOCs without crediting the original pedagogical innovations and aspirations behind the concept. Silicon Valley turned the concept into a buzzword and then, having wrung all the meaning out of it, abandoned the field. You can blame the companies for glorifying products over practice. Or the media for indulging in the fantasy (yet again) that somehow software will fix everything. Or politicians and administrators who wanted to believe in silver bullets.
Ultimately, it doesn’t matter who is to blame. An idea with the potential to improve and transform teaching practice was obscured and devalued through a process that might have explained it, promoted it and made it easier to adopt.
As someone who has spent much of my career in education technology, I’m generally an optimist. Technology, when used well, can empower educators to create effective learning experiences and scale powerful teaching. But faculty must play a role in ed-tech development and implementation if we’re to see those effective innovations come to light.
This is the lesson from Coursera’s story that we should care about. As MOOCs move through the Gartner hype cycle from the “trough of disillusionment” up the “slope of enlightenment,” educators and institutions that do not abandon the form may very well retain, rediscover and recontextualize the original educational ideas that underpinned the potential of the massive open online course. Perhaps this period would be better named the “slope of rediscovery,” which all too often follows a massive waste of time and resources.
Those of us who care about quality education can do better. We can insist that tool makers and promoters maintain their focus on the core teaching insights that enable their offerings to provide value. And in doing so, we might mitigate the loss of purpose that seems to happen with every ed-tech hype cycle. How can we reclaim buzzwords and imbue them with meaning?
Consider the case of the buzz phrase du jour, “personalized learning.” Anyone who has taught knows that some students are easier to reach than others. Anyone who has taught regularly knows that there are structural reasons -- institutional, personal or other -- that can make reaching some students harder than it otherwise would be. Instead of allowing another meaningless buzz phrase to come and go, leaving chaos in its wake, why don’t we insist that conversations about personalized learning be about approaches and tools for reaching those students?
At e-Literate, we are creating a series of short explainer videos that we hope will change the conversation around the term and make it useful to anyone who cares about teaching quality. They are meant to feel like commercials -- in a good way -- and act like public service announcements.
The first video associates the term with a concrete teaching need.
Notice that we focus on goals and techniques, rather than features and products. We describe personalized learning as a collection of technology-supported teaching techniques for reaching hard-to-reach students.
The second video frames three buzz phrases -- flipped classroom, learning analytics and adaptive learning -- as ways to support three possible methods for achieving that goal.
To be clear, our purpose for producing these videos is not to persuade you to adopt any of these approaches. Rather, we want to reframe conversations between you and your vendors to make the outcomes more useful to you and your students. Could you improve your teaching if you had the right learning analytics at your disposal? Maybe. You are the person who is in the best position to answer that question.
When a vendor or provost or department head comes to you with a learning analytics product for your consideration, at least part of the discussion should be about whether and how the product’s capabilities might help you to reach your hard-to-reach students. The best way to ensure these conversations produce value is to come prepared with your own ideas and questions about how such tools could be useful to you in your context. The people presenting these products should come prepared to address your ideas and questions -- and maybe suggest some approaches you hadn’t thought of but that colleagues elsewhere have tried with some success.
Many vendors want to have this conversation and would benefit from it. The best way to sell a product is to convince the buyer that it will satisfy a specific need. The clearer you are in your own mind about your teaching goals and the kinds of tools that would help you reach them, the more specific the vendors can be in designing and promoting products that meet your needs. I believe that many people who work for these vendors genuinely want to help. Both Coursera and Udacity were started by educators who were inspired by their own experiences teaching MOOCs.
But it is easy for even well-intentioned companies to lose their way. They need educators to be clear and insistent regarding the kinds of help that would best serve them and their students. Faculty have an opportunity to influence ed-tech development and implementation. Reclaim the buzzwords and participate in the process. Your students will thank you.
I was recently having dinner with my dissertation adviser, Scott @shershow, catching up after many years, and at one point during the meal our conversation predictably drifted to something someone said on Twitter. Scott paused and said, “I must admit I don’t really get Twitter.”
He had joined Twitter maybe a year ago, had a couple dozen followers and was trying to become more familiar with it. But his admission suggested a murkiness and mysteriousness around the medium -- qualities we tend to forget after several years of obsessive tweeting and accumulating thousands of followers, retweets and likes.
My mentor may be near a tipping point: either ready to abandon Twitter, or just on the verge of getting it, to use his word. Without wanting to sound like a hyped-up social media evangelist, let me see if I can help. What can Twitter be for academics?
A way to write! Twitter can help make your prose stronger, clearer and, most important, shorter. We often get into bad habits when we write for narrow disciplinary audiences, and Twitter can help jostle you out of wordy discursive patterns that have become unconscious.
An archive. Twitter is a place to keep research findings: insights, startling juxtapositions and oddities are all at home on your Twitter feed. Use Twitter as a living archive, one that you can quite easily download to your hard drive every once in a while and comprehensively search. If you search for keywords or proper names, you may find threads and thoughts that can be expanded into larger investigations or arguments.
A venue in which to be cited. When you tweet your scholarship, you shouldn’t worry about someone scooping you. Realize instead that people can now reference you on Twitter and you can later integrate such points and rapid dialogues into papers, articles or books. Likewise, keep track of poignant remarks that you spot on Twitter so you can recall them later and weave them into something you are working on.
A great teaching tool. Create a Twitter assignment, like the one my colleague @twel in the English Department at Loyola University New Orleans taught me, where students keep reading notes on Twitter, using a hashtag to create a live, interactive dialogue about your weekly reading. It’s also a way for you to interact with students. That can be risky, of course -- there are some things you’d rather not know about students’ late-night habits or existential crises. But the benefits outweigh such risks. Basically, it is a way to model to students not only how academic interests intersect with everyday life but also good interactive etiquette. Again, that can get dicey on Twitter, but even the worst-case examples of Twitter spats lend themselves to object lessons concerning written communication, the viral potential of the digital, the need to take time for reflection and how to be respectful within the strange realm of social media (and beyond).
A mode of communication. This may sound all too obvious, but once you fully embrace the wide reach of Twitter, it becomes a way to get the attention of all sorts of people and entities, including popular stars, politicians and airline officials during a flight cancellation. They may not always seem to hear or reply, but when they do, it can be quite satisfying. Look at how essayist and novelist @rgay engages readers, celebrities, critics and ordinary people of all stripes on Twitter -- talk about writing for an audience.
A way to promote your work. This isn’t just about becoming a shill or rampant capitalist. This is about using the tools at hand to help get your work out there to a real, reading audience. When your book is published, tweet about it. Look what philosopher @michael_marder did when his @objectsobjects book Dust came out this past January: he tweeted “dust specks,” or little insights that came from and piled up around the book. You too can tweet little snippets from or aphorisms about your book when it is published, and even just one each day will help your book actually sell. And more important, this can help your book find readers. When I talk to editors about this issue (for instance, @mxmcadam at Johns Hopkins University Press and I have discussed this many times), they invariably tell me they prefer it if their authors are active on Twitter -- and for good reason. It not only aids the struggling and overwhelmed marketing efforts of publishers but is also a way to do your work justice, to dare to be public about your intellectual work.
A critical platform. There is nothing like seeing the sharp television criticism of New Yorker journalist @emilynussbaum, the everyday analysis of sociologist @tressiemcphd or the home appliance criticism of media theorist @ibogost unfold in real time on Twitter. Twitter is a way to engage in lively critique: it is a vibrant medium for pithy reviews, trenchant commentary and subtle demystification. Of course you always set yourself up to be lampooned by a withering GIF or deflated by an ironic reply, but isn’t this a healthy thing for critics to keep in mind?
A community. The environmental policy scholar @raulpacheco started his #scholarsunday hashtag as a way to bring scholars together on Twitter, and it has been so successful that now it seems like every day of the week is Sunday. I’ve gone on to meet in person so many of the people I originally connected with on Twitter (including Raul himself), and that experience then reflexively rejuvenates the Twitter community. So if you feel like posing a question to a scholar you admire -- or just placing a question out in the seeming void -- there is a good chance that you will get a response, and usually it will be smart and useful. And then you may end up having a drink with your virtual respondent at a conference in the future, and possibly forming an ongoing friendship, professional collaboration or both.
It is worth repeating No. 1: it is a way to write. You can actually draft entire essays, book chapters and conference papers on Twitter and then get live feedback as you go. It is scary sometimes, of course, to write in public -- to reveal your research before a legitimate outlet like a university press or a well-regarded journal has vetted or published it. But, in the end, this is a leap of faith that will almost always make the work better -- the end being publication elsewhere, like here. This piece started as a handful of tweets about how I use Twitter as an academic.
Christopher Schaberg (@airplanereading) is an associate professor of English at Loyola University New Orleans.