I’ve taught both theoretical and applied university classes in my academic career, and the opening lecture always has one thing in common: an invitation to my students to demand something of me. I tell them to insist that they walk away from my course writing better, speaking better, thinking more analytically and feeling a little more comfortable with numbers. Indeed, I urge them to insist the same of their university education writ large.
My efforts to frame courses like Sociology of Education and Social Science Research Methods around a set of broadly applicable skills align me with an “outcomes” orientation increasingly promoted by academic, business and political leaders. Yet whether or not my students achieved the four capacities I encourage, their college academic transcript will never tell.
If the answer to those who doubt the value of higher education is to trumpet the full educative impact of a postsecondary education, students deserve a credential that describes their full set of educative experiences. The time has come to extend the traditional academic transcript and begin issuing Postsecondary Achievement Reports (PARs): verified summative documents issued by colleges and universities that align with and reflect each institution’s deeper educative goals.
While every institution could issue a PAR according to its own academic policies, what defines a PAR is a set of generally accepted conventions for the structure and technical formatting of academic transcripts that include co-curricular and competency-based information, along with traditional information such as courses, grades and credits. But before describing the PAR in greater detail, let’s first set some context for why colleges and universities have begun to think differently about how they document learning outcomes.
Defenders of academe are inclined to agree that transcripts and diplomas are insufficient credentials, though for very different reasons. As the scholar Andrew Delbanco argues in “College: What It Was, Is, and Should Be,” the traditional four-year college experience can be an exploratory time for students to discover their passions and test ideas and values with the help of teachers and peers. If a degree is really about developing a whole person, and preparing them with a humanistic education that will serve them in a very dynamic career landscape, surely a ledger of courses and grades alone is a poor reflection of that experience.
Indeed, institutions with a more vocational orientation face a similar challenge documenting the industry-skill certifications their graduates achieve on their way to conventional degrees.
It’s not surprising that, given these pressures, higher education has, in fact, put forth efforts to innovate the credential. Three distinct developments are already in process: co-curricular transcripts, competency-based transcripts and data-enabled eTranscripts. Together they lay the foundation for a new generation of academic credentialing. Co-curricular and competency-based transcripts innovate at the level of content and substance, extending the academic transcript. Electronic transcripts innovate the medium of credentials, enabling machine-readable data and analytics that can make student learning outcomes more easily understood and actionable.
Each initiative has successfully generated some momentum and adoption in the higher education community. For example, Northern Arizona University is doing innovative work documenting the student competencies that have been mastered via coursework, and the State University of New York at Geneseo is continuing long-standing efforts to capture the student leadership, research, study abroad and other co-curricular experiences that define its vision of a postsecondary education.
While these institutions are already extending their transcripts, there are good reasons for concern that the grassroots nature of their innovations will conspire against their own success. Specifically, I fear a Tower of Babel if we do not find a way to converge around a lingua franca that describes the basic structure of the 21st-century extended transcripts being issued by pioneering universities across the country.
We take for granted the fact that transcripts make sense; we all expect to see a course title and number and a letter or number grade, presented in chronological sequence. But transcripts are not actually standardized in any formal sense. My company, Parchment, exchanges millions of electronic transcripts each year. Our platform has been developed to help both sending and receiving institutions align and utilize the different information transcripts contain. For example, colleges may award different numbers of credits for essentially the same course. A-level work at one college may be B-level work elsewhere. Over time the academy gravitated toward a basic document structure, along with a strong professional code for issuing transcripts that remains a sacred trust of our university registrars. This standardization respects academic freedom while supporting learners in their pursuit of academic and professional opportunities, for example when transferring between institutions and seeking course credit for prior learning.
How does a university articulate a competency transcript from a peer institution? Where does Ernst & Young look for evidence of leadership, when each institution’s co-curricular information is reported in different sections, with no convention for describing the process by which activities were verified? How do various information systems import achievement data, when the field names and file formats lack any rhyme or reason? Before you know it, the best intentions and efforts give rise to documentation that isn’t widely understood, reliable or actionable.
We need to extend the transcript, but we need a method to do it within a well-worn convention that is backward compatible. By backward compatible I mean we need to preserve the role of the traditional academic transcript, and create a reasonable roadmap for extending it, when institutions so choose, in a way that serves students, educators, associations and employers.
This is why I am calling for a “PAR,” a Postsecondary Achievement Report. A PAR is a concise, electronic document that provides a standardized, machine-readable report of the full range of higher education experience. It can be verified by the academic registrar to confirm credibility, and it creates a common understanding of both course-based and campus-based achievements. A PAR does it sensibly, recognizing academic freedom. It is not a uniform way to grade; rather, it is a consistent document structure and data standard when institutions choose to extend their traditional academic transcripts. The PAR can be issued alongside a traditional transcript, or act as its next generation successor. It is a summative statement from the institution and a passport for the learner.
Perhaps the best model for a PAR is the Higher Education Achievement Report (HEAR), which has been evaluated for almost 10 years in Britain. The HEAR not only provides a standardized academic transcript; it also captures information relevant to employers. And the information is captured and transferred as electronic and verifiable data.
The HEAR is a maximum of six pages long and adheres to a standard template. It is verified by the academic registrar and regularly updated throughout a student’s enrollment. It is accessible by the student at any time, and is unique and personalized to them. The HEAR contains six sections:
Personal information about the student (name, date of birth, etc.)
Name and title of degree earned
Level of degree in the context of a defined, national framework
Detailed course information and results
Information about the degree and professional status (if applicable)
Additional awards and activities
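To make the idea of a “standardized, machine-readable” report concrete, the six sections above could be represented as structured data. The sketch below is purely illustrative: the field names, values, and JSON serialization are my own assumptions, not part of any published HEAR or PAR specification.

```python
import json

# Hypothetical machine-readable record mirroring the HEAR's six sections.
# All section and field names are illustrative assumptions.
hear_record = {
    "personal_information": {"name": "Jane Doe", "date_of_birth": "1995-04-12"},
    "degree": {"name": "Sociology", "title": "Bachelor of Science"},
    "degree_level": {"framework": "FHEQ", "level": 6},
    "courses": [
        {"code": "SOC101", "title": "Introduction to Sociology", "result": "A"},
    ],
    "professional_status": None,  # populated only if applicable
    "additional_awards_and_activities": [
        {"activity": "Student Senate", "verified_by": "Office of Student Affairs"},
    ],
}

# Serializing to a common format is what would let different institutions'
# information systems import the data without guessing at field names.
serialized = json.dumps(hear_record, indent=2)
print(len(hear_record))  # six top-level sections
```

The point is not the particular format but the convention: when every issuer uses the same section names and structure, a receiving system can parse any institution’s report the same way.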
In their final report, the HEAR’s creators summarize well the goal of their work: “The HEAR has been designed to encourage a sophisticated approach to recording achievement that better represents the full range of outcomes from learning and the student experience in higher education at the same time as encouraging personal development that is commensurate with a culture of lifelong learning.”
The HEAR is one example; there are various efforts internationally to create a more standardized way of reporting and documenting academic achievement. Australia has adopted something similar with the Australian Higher Education Graduation Statement (AHEGS). Our colleagues abroad also recognize the need to extend the transcript electronically, but do it in a way that is understood among all constituents nationally and internationally.
The U.S. is the world leader and innovator in postsecondary education; we can take extended transcripting to the next level.
To succeed, we need to start with a core set of institutions, particularly those that are already doing competency-based and/or co-curricular transcripting, and have adopted eTranscripts. Those institutions can share their experience to establish a set of conventions that creates a common language for all. Institutions can create a roadmap for implementation, adopting sections when ready. The beauty of a PAR is that it represents incremental change. At least some sections of a PAR would be immediately actionable by any institution using eTranscripts. If a limited-form PAR is as far as an institution is comfortable going at first, so be it. The pace of the roadmap will be driven, in part, by the validation PAR receivers give to more robust PAR issuers. In other words, if employers or grad school admissions committees start paying more attention to parts of the PAR, more institutions will add them. The more valuable a “complete” PAR is found to be, the more it will be demanded, and the broader and faster adoption will be.
Such an effort will require the collaboration of a number of campus leaders beyond registrars and admission officers. Chief academic officers, deans of continuing education and online programs, directors of student affairs and career services, as well as other campus leaders, will need to be engaged in the conversation both on their campuses and through their national organizations. And core organizational stakeholders like the American Association of Collegiate Registrars and Admissions Officers (AACRAO) and the P20W Education Standards Council (PESC) are central actors in helping to make the PAR a reality.
We are all ready for a new era of credentials communication, one that is aligned with our more mobile and digital culture. The PAR will be universally understood and actionable. It will be easily portable and stackable in an individual’s personal, online credential profile. Lifelong learners will start with a PAR, then continue to add digital academic or professional credentials from an ever-growing diversity of resources — from degrees to certificates to badges — to their profile, and present a verifiable, complete picture of education and skills.
In addition to knowledge and specific skills, a college experience imparts the ability to communicate a compelling story, to synthesize information into a bigger picture and to use data and numbers to understand a problem. Those are some of the characteristics that Google is looking for and that LinkedIn wants to help employers identify. In our knowledge economy, where opportunities are defined by what you know and how well you know it, a PAR will provide the foundation for learners, educators and employers to make more insightful and successful decisions.
Returning to the skills I encourage my students to demand in my opening lecture, for me the PAR is personal. I know from both my academic and professional experience how much they matter. In my last lecture — to the great surprise of my students — I reveal that before becoming an academic I was a technology entrepreneur. The skills I said they should demand are the reason my co-founders and I could create and build Blackboard.
We must ensure that the significant value gained during one’s postsecondary journey is captured and validated. A PAR would be a major step to empower learners, and help them turn credentials into opportunities.
Matthew Pittinsky is the CEO of Parchment and co-founder and former CEO of Blackboard. He is on the faculty of Arizona State University, and serves on the Board of Trustees of the Woodrow Wilson National Fellowship Foundation.
Whether we call it protesting, mudslinging, or “digital hate,” as Chancellor Phyllis Wise did in her blog post addressing University of Illinois’ Twitter incident, there is nothing new about very public, incendiary criticism occurring online — or in person. Racist and derogatory slurs and innuendos happen every day, in our college and university student centers, in our residence halls, out on the field at games. And numerous colleges and universities have felt the wrath of social media outrage in response to a decision, changes in leadership, and other developments.
As those of us in higher education know all too well, we lack the time, staff and resources to police our students on the Internet through disciplinary action. It’s simply not feasible or reasonable, nor is it conducive to free speech.
Our colleges and universities need to take a proactive stance and realize that digital identity development – something that thought leaders such as Eric Stoller have highlighted as part of the conversation defining student affairs and higher education – can and should be a part of our institutional curriculums. This is more than just a major in social media that focuses on marketing skills, or the occasional guest speaker at a student event. This goes beyond our coaches handing out guidelines to athletes.
This is student affairs and academic leadership making a commitment to offer educational outreach and resources to students campus-wide, ideally through first-year courses, so that all freshmen benefit. Colleges are increasingly offering classes that cover important topics like financial literacy, as part of their orientation classes for incoming students. What if more colleges and universities devoted some orientation class time to digital identity topics such as personal branding, where students were required to critically examine case studies of individuals (companies, politicians, actors, etc.) who suffered the consequences of doing something awful online? Such an exercise would surely help them realize their mistakes live on in infamy online. Knowing how to unplug and be present and in the moment is another area where first-year students would benefit from receiving ideas and resources to discuss and develop with one another. Basic digital literacy skills, such as knowing the professional benefits of writing emails so that they don’t come across as casual, flippant texts to friends, would be worth sharing in a first-year course experience for all incoming students.
Career services also has a part to play in providing regular, ongoing guidance and resources so students can market their ideas, potential and leadership online — not just in their senior years, but right from the beginning, as part of their experiences in pursuing internships, degrees and ultimately, jobs. Talk to average college students and, surprisingly, some of them think LinkedIn is something their parents use, not something they should be tapping into to network and explore job and internship options. If career services counselors started working with them early on to develop LinkedIn profiles, imagine how much easier it might be for students to research great internships and connect with potential employers, alumni and mentors throughout their time in college.
The pressure is on for higher education to get with the program and be more relevant to what students need to become gainfully employed after college. How far into the future will these hateful tweets haunt University of Illinois students once they start looking for jobs? My guess is forever. How will these students, many of whom have grown up in a highly digitized world where communication is immediate and readily shared through numerous technologies, realize their potential as online ambassadors without some sort of educational outreach?
The other glaring part of the weird, uncertain, ever-changing journey of social media is that these problems — which range from online gaffes and faux pas to blatant racism and sexism — are not just limited to our students. Our faculty and staff are struggling with digital engagement and how to share their thoughts and ideas online in ways that don’t damage their reputations and those of our colleges and universities. There are plenty of examples of educators being reprimanded or even fired because of poor behavior on social media. Perhaps that’s why higher education has been slow to address the need for digital identity development. Many of us employed at our institutions are grappling with the best way to use social media, at a time when technology is transforming our industry. We’ve yet to really tap into a universal, comprehensive way to address this issue at most of our colleges and universities. To bring things full circle and make digital identity development fully integrated into higher education, we need to provide more training for faculty and staff, so they have a better understanding of why digital identity matters. That’s got to be part of the mix.
As Chancellor Wise wrote, “we still have work to do” in response to the University of Illinois incident. And that work must go beyond one-time disciplinary actions to address something larger, something that is fundamentally lacking at most of our institutions: providing digital identity development educational outreach and support to our campus communities, across the board.
Becca Ramspott is a communications specialist at Frostburg State University.
Recently The Atlantic predicted that one of the top five trends impacting higher education will be a push toward credit given for experience, proficiency and documented “competency.” The recent results of Inside Higher Ed’s survey of chief academic officers also show openness to competency-based outcomes.
For many, myself included, this simply sounds like a series of placement tests and seems like a pretty shallow approach to a college education and degree. However, as vice president of enrollment and chief marketing officer for a residential college, I can’t ignore the appeal of the “validation” of learning this trend suggests.
In fact, I find myself thinking more and more about how residential colleges, with their distinct missions, might respond to the potential threat this trend represents. I find myself hoping we can prove the residential environment results in valuable learning and life experiences beyond getting along with a roommate, asking someone on a date, learning how to tap a keg and configuring a renegade wireless network.
We can do more. Perhaps the idea of competency-based education should inspire us to think differently about how the learning environment of the residential experience is superior. Perhaps there are competencies associated with a residential college we’ve not done an adequate job of documenting?
This will not be easy for most of us. Our natural instinct to “wait and see how good our students turn out” to justify why students should live and learn on campus won’t work this time, as we face a skeptical public and witness more and more college presidents, administrators and boards reconsidering the value of online education. With some intentionality, we can do a much better job of proving why learning in a residential setting is superior.
We need to ask ourselves: Why is the residential campus experience of utmost importance to a contemporary undergraduate education? We must identify the sorts of learning that can only occur in such a setting, and validate, or better identify, the learning competencies that occur outside the classroom on a residential campus.
This will be difficult in an environment defined by shrinking resources, when many resort to thinking about eliminating activities considered not central to the core mission. The instinct is to cut, de-emphasize or keep separate and second. We see this time and time again in any setting that faces difficult choices about resources. But investment, integration and intentionality create a better path forward.
Can liberal arts colleges resist the urge to cut, and rethink how activities in the residential environment are central to the core mission? Can these colleges develop meaningful ways of measuring the value and impact of such activities and how they result in competencies that add value and worth? Can residential liberal arts colleges develop a “currency” that demonstrates they value out-of-classroom learning comparably to in-classroom learning? I hope so.
While many colleges would benefit from integrating out-of-classroom learning, residential liberal arts colleges must do so because of the infrastructure around which our colleges have been built -- residence and dining halls, student activity centers, athletic venues and performance halls. We need to prove these are not just modern amenities, but central to superior learning.
To validate this learning experience, residential liberal arts colleges will need to rethink historic barriers. Learning that occurs outside the classroom can no longer be viewed as “separate and second.”
Extra-curricular and co-curricular transcripts that fully document competencies and outcomes essential to success beyond college must evolve to be fully integrated with the academic program, and valued both internally and externally.
First, residential liberal arts colleges must clearly define the learning outcomes and expectations. This is frequently a faculty-driven exercise. Understanding the knowledge gained from an activity provides a framework around which out-of-classroom learning can be developed. This framework will allow for alignment of purpose and some measure of control about how central an out-of-classroom activity is to the core mission and which competencies are satisfied as a result.
Georgetown University was recently recognized for its excellent programming in the area of preparing student-athletes for leadership. Recognition of activities that successfully align with and even expand learning is critical for the public to be convinced that such activities are core to a high-quality education.
Next, residential liberal arts colleges must create a “currency” that meaningfully recognizes those activities that advance a student’s education, e.g., elective academic credit, a credit-bearing on-campus internship, or certificate for activities that demonstrate substantive interest and professional and personal development.
Student activities might be reorganized into mission-focused areas that provide students with experience not always fully represented in the academic program, but with relevance to a successful application for employment or graduate school.
Some examples might be: Leadership, Teamwork, Civic Engagement, Social Justice, Service Learning, Entrepreneurship and Business Development, Intercultural Understanding, Interfaith and Spiritual Development, Public Relations and Event Planning, and Sustainability. This approach is similar to competency-based certification, but broader than proof that a student can read a balance sheet or do a 10-minute presentation.
Finally, liberal arts colleges should engage in a broader conversation about why they are residential, without saying it’s because they’ve been that way for 150 years. Too many colleges assume students already understand. Such a learning environment can positively shape a student’s character and skillset, and result in sweeter success, but a residential community does not always acknowledge or articulate this success.
With competency-based education in the spotlight, residential colleges have an opportunity to renew a focus on the benefits to students who not only eat and sleep, but also meet colleagues, connect with mentors, challenge themselves in new ways, and develop 21st-century skills and competencies on campus.
If we do not champion and clearly identify the benefits to our students, we are vulnerable to the advocates of no-frills bachelor’s degrees, willy-nilly life experience for credit, online learning, and the commodifiers among us who believe the value of the college experience is test- and content-driven, rather than experiential and residential in nature.
W. Kent Barnds is executive vice president and vice president of enrollment, communication and planning at Augustana College, in Rock Island, Ill.
Submitted by Joe O'Shea on January 16, 2014 - 3:00am
Over the next few weeks, students around the country will receive offers of admission to colleges and universities. But before students jump online and accept an offer, I have one piece of advice for them: They might be better off not going to college next year.
Instead, they should think about taking a gap year, to defer college for a year to live and volunteer in a developing country.
In the traditional sort of gap year, students immerse themselves in a developing community to volunteer with a nonprofit organization by teaching, working with local youth, or assuming some other community role.
Gap years have been rising in popularity in the United States, the United Kingdom, Australia, and elsewhere. I’ve spent the last few years researching what happens to young people when they have such an immersive experience in a community radically different from their own.
The answer, in short, is that gap years can help change students in ways the world needs.
The challenges of our time demand an educational system that can help young people to become citizens of the world. We need our students to be smart, critical and innovative thinkers but also people of character who use their talents to help others. Gap years help young adults understand themselves, their relationships, and the world around them, which deepens capacities and perspectives crucial for effective citizenship. They help students become better thinkers and scholars, filled with passion, purpose, and perspective.
How do people learn from gap years?
One principal lesson is clear: We often develop most when our understandings of ourselves and the world around us are challenged -- when we engage with people and ideas that are different. Despite this insight, we often prioritize comfort and self-segregate into groups of sameness. We tend to surround ourselves with people who think, talk, and look similar to us.
Taking a gap year speeds our development by upsetting these patterns. Trying to occupy another's way of life in a different culture -- living with a new family, speaking the language, integrating into a community, perhaps working with local youth, for instance -- these are valuable experiences that help young people understand themselves, develop empathy and virtue, and expand their capacity to see the world from others' perspectives.
Traditionally, U.S. higher education has championed the idea of liberal arts as a way of getting students to engage with difference, to expand their worldview beyond their known universe by (in the words of a Harvard research committee on education) “questioning assumptions, by inducing self-reflection... by encounters with radically different historical moments and cultural formations.”
However, formal classroom education alone cannot accomplish this aim. The classroom is limited in its ability to engage students with difference and contribute to their development as able citizens. We also need new experiences that inspire critical self-reflection to cultivate the right moral feelings and dispositions.
What’s important here is the productive dissonance that these long-term, immersive gap year experiences provide. It's unlikely that a young person staying in America -- or even traveling overseas for a short time -- would have assumptions about herself and the world around her challenged with the same intensity, frequency, and breadth as in a gap year in a developing community.
It's interesting that spending time in developing communities can help young people appreciate ways of living that we need more of -- such as a more active and intimate sense of community. Going overseas also helps to cultivate a type of independence and self-confidence that staying close to home in a familiar environment probably does not.
Furthermore, taking the traditional kind of gap year after high school helps students to take full advantage of their time in college. One telling observation is that many students who take gap years end up changing their intended major after returning. During college, their gap year experiences enrich their courses, strengthen co-curricular endeavors, and animate undergraduate research and creative projects.
To be clear: Though these gap year students are working in partnership with a community organization and aim to make some positive impact, the students typically, at least in the short term, gain more than they are able to give. But this empowers them to bring new perspectives to bear in other personal, professional, and civic efforts. Gap years, borrowing a line from the Rhodes Scholarship Trust, can help create leaders for the world’s future.
Despite the benefits of these kinds of gap year experiences, too few Americans take gap years and too few colleges encourage them. The treadmill from high school to college makes it hard for students to see alternative paths. But that is changing. More people and organizations are beginning to see gap years for the formative experiences they can be, given the proper training, support, and community work. In fact, all the Ivy League universities now endorse gap years for interested students. And they’re right to do so.
Many parents and students are nervous about the idea of spending an extended period in a developing country. But these experiences, especially through structured gap year programs like Global Citizen Year, are generally very safe and supported. Are there some risks? Of course, there are risks with any travel or change -- but the risks are worth taking. The investment in taking a gap year will pay dividends throughout one’s college career and beyond, as one’s life and society are enriched.
However, one central challenge that remains is how to finance gap years for students from lower-income families. This is also beginning to change. The University of North Carolina and Princeton University, for instance, have both begun to subsidize gap years for incoming students. Other organizations, such as Omprakash, now offer low-cost volunteer placements as well as scholarships to those with need. And with the help of crowdfunding sites, students are able to fund-raise for these experiences with greater ease. Despite these efforts, if gap years are to really expand, we’ll need more institutions or governments to offset the costs.
Higher education is society’s last mass effort to really shape the character and trajectories of our young people. Let’s help them take fuller advantage of their precious time in college by taking a gap year beforehand.
Joe O’Shea is director of Florida State University's Office of Undergraduate Research and author of Gap Year: How Delaying College Changes People in Ways the World Needs (Johns Hopkins University Press).