Whether we call it protesting, mudslinging, or “digital hate,” as Chancellor Phyllis Wise did in her blog post addressing University of Illinois’ Twitter incident, there is nothing new about very public, incendiary criticism occurring online — or in person. Racist and derogatory slurs and innuendos happen every day, in our college and university student centers, in our residence halls, out on the field at games. And numerous colleges and universities have felt the wrath of social media outrage in response to a decision, changes in leadership, and other developments.
As those of us in higher education know all too well, we lack the time, staff and resources to police our students on the Internet through disciplinary action. It’s simply not feasible or reasonable, nor is it conducive to free speech.
Our colleges and universities need to take a proactive stance and recognize that digital identity development, something thought leaders such as Eric Stoller have highlighted as part of the conversation defining student affairs and higher education, can and should be part of our institutional curricula. This is more than a major in social media that focuses on marketing skills, or the occasional guest speaker at a student event. It goes beyond our coaches handing out guidelines to athletes.
This is student affairs and academic leadership making a commitment to offer educational outreach and resources to students campus-wide, ideally through first-year courses, so that all freshmen benefit. Colleges are increasingly covering important topics like financial literacy in their orientation courses for incoming students. What if more colleges and universities devoted some of that orientation time to digital identity topics such as personal branding, requiring students to critically examine case studies of people and organizations (politicians, actors, companies, etc.) who suffered the consequences of doing something awful online? Such an exercise would help students realize that online mistakes live on in infamy. Knowing how to unplug and be present in the moment is another area where first-year students would benefit from ideas and resources to discuss and develop with one another. Basic digital literacy skills, such as knowing the professional benefits of writing emails so that they don't come across as casual, flippant texts to friends, would also be worth sharing in a first-year course experience for all incoming students.
Career services also has a part to play in providing regular, ongoing guidance and resources so students can market their ideas, potential and leadership online, not just in their senior year but right from the beginning, as part of their experiences in pursuing internships, degrees and, ultimately, jobs. Talk to average college students and, surprisingly, some will tell you that LinkedIn is something their parents use, not something they should be tapping into to network and explore job and internship options. If career services counselors started working with students early on to develop LinkedIn profiles, imagine how much easier it might be for them to research great internships and connect with potential employers, alumni and mentors throughout their time in college.
The pressure is on for higher education to get with the program and be more relevant to what students need to become gainfully employed after college. How far into the future will these hateful tweets haunt University of Illinois students once they start looking for jobs? My guess is forever. How will these students, many of whom have grown up in a highly digitized world where communication is immediate and readily shared through numerous technologies, realize their potential as online ambassadors without some sort of educational outreach?
The other glaring part of the weird, uncertain, ever-changing journey of social media is that these problems, which range from online gaffes and faux pas to blatant racism and sexism, are not limited to our students. Our faculty and staff are also struggling with digital engagement and with how to share their thoughts and ideas online in ways that don't damage their reputations or those of our colleges and universities. There are plenty of examples of educators being reprimanded or even fired because of poor behavior on social media. Perhaps that's why higher education has been slow to address the need for digital identity development. Many of us employed at our institutions are grappling with the best way to use social media at a time when technology is transforming our industry, and we've yet to find a universal, comprehensive way to address this issue at most of our colleges and universities. To bring things full circle and make digital identity development fully integrated into higher education, we need to provide more training for faculty and staff, so they have a better understanding of why digital identity matters. That's got to be part of the mix.
As Chancellor Wise wrote, “we still have work to do” in response to the University of Illinois incident. And that work must go beyond one-time disciplinary actions to address something larger, something that is fundamentally lacking at most of our institutions: providing digital identity development educational outreach and support to our campus communities, across the board.
Becca Ramspott is a communications specialist at Frostburg State University.
Recently The Atlantic predicted that one of the top five trends impacting higher education will be a push toward credit given for experience, proficiency and documented “competency.” The recent results of Inside Higher Ed’s survey of chief academic officers also show openness to competency-based outcomes.
For many, myself included, this simply sounds like a series of placement tests and seems like a pretty shallow approach to a college education and degree. However, as vice president of enrollment and chief marketing officer for a residential college, I can’t ignore the appeal of the “validation” of learning this trend suggests.
In fact, I find myself thinking more and more about how residential colleges, with their distinct missions, might respond to the potential threat this trend represents. I find myself hoping we can prove the residential environment results in valuable learning and life experiences beyond getting along with a roommate, asking someone on a date, learning how to tap a keg and configuring a renegade wireless network.
We can do more. Perhaps the idea of competency-based education should inspire us to think differently about how the learning environment of the residential experience is superior. Perhaps there are competencies associated with a residential college that we've not done an adequate job of documenting.
This will not be easy for most of us. Our natural instinct to “wait and see how good our students turn out” to justify why students should live and learn on campus won’t work this time, as we face a skeptical public and witness more and more college presidents, administrators and boards reconsidering the value of online education. With some intentionality, we can do a much better job of proving why learning in a residential setting is superior.
We need to ask ourselves: Why is the residential campus experience of utmost importance to a contemporary undergraduate education? We must identify the sorts of learning that can only occur in such a setting, and validate, or better identify, the learning competencies that occur outside the classroom on a residential campus.
This will be difficult in an environment defined by shrinking resources, when many resort to thinking about eliminating activities considered not central to the core mission. The instinct is to cut, de-emphasize or keep separate and second. We see this time and time again in any setting that faces difficult choices about resources. But investment, integration and intentionality create a better path forward.
Can liberal arts colleges resist the urge to cut, and rethink how activities in the residential environment are central to the core mission? Can these colleges develop meaningful ways of measuring the value and impact of such activities and how they result in competencies that add value and worth? Can residential liberal arts colleges develop a “currency” that demonstrates they value out-of-classroom learning comparably to in-classroom learning? I hope so.
While many colleges would benefit from integrating out-of-classroom learning, residential liberal arts colleges must do so because of the infrastructure around which our colleges have been built -- residence and dining halls, student activity centers, athletic venues and performance halls. We need to prove these are not just modern amenities, but central to superior learning.
To validate this learning experience, residential liberal arts colleges will need to rethink historic barriers. Learning that occurs outside the classroom can no longer be viewed as “separate and second.”
Extra-curricular and co-curricular transcripts that fully document competencies and outcomes essential to success beyond college must evolve to be fully integrated with the academic program, and valued both internally and externally.
First, residential liberal arts colleges must clearly define the learning outcomes and expectations. This is frequently a faculty-driven exercise. Understanding the knowledge gained from an activity provides a framework around which out-of-classroom learning can be developed. This framework will allow for alignment of purpose and some measure of control about how central an out-of-classroom activity is to the core mission and which competencies are satisfied as a result.
Georgetown University was recently recognized for its excellent programming in preparing student-athletes for leadership. Recognition of activities that successfully align with and even expand learning is critical if the public is to be convinced that such activities are core to a high-quality education.
Next, residential liberal arts colleges must create a "currency" that meaningfully recognizes those activities that advance a student's education, e.g., elective academic credit, a credit-bearing on-campus internship, or a certificate for activities that demonstrate substantive interest and professional and personal development.
Student activities might be reorganized into mission-focused areas that provide students with experience not always fully represented in the academic program, but with relevance to a successful application for employment or graduate school.
Some examples might be: Leadership, Teamwork, Civic Engagement, Social Justice, Service Learning, Entrepreneurship and Business Development, Intercultural Understanding, Interfaith and Spiritual Development, Public Relations and Event Planning, and Sustainability. This approach is similar to competency-based certification, but broader than proof that a student can read a balance sheet or do a 10-minute presentation.
Finally, liberal arts colleges should engage in a broader conversation about why they are residential, without saying it’s because they’ve been that way for 150 years. Too many colleges assume students already understand. Such a learning environment can positively shape a student’s character and skillset, and result in sweeter success, but a residential community does not always acknowledge or articulate this success.
With competency-based education in the spotlight, residential colleges have an opportunity to renew a focus on the benefits to students who not only eat and sleep, but also meet colleagues, connect with mentors, challenge themselves in new ways, and develop 21st-century skills and competencies on campus.
If we do not champion and clearly identify the benefits to our students, we are vulnerable to the advocates of no-frills bachelor’s degrees, willy-nilly life experience for credit, online learning, and the commodifiers among us who believe the value of the college experience is test- and content-driven, rather than experiential and residential in nature.
W. Kent Barnds is executive vice president and vice president of enrollment, communication and planning at Augustana College, in Rock Island, Ill.
Submitted by Joe O'Shea on January 16, 2014 - 3:00am
Over the next few weeks, students around the country will receive offers of admission to colleges and universities. But before students jump online and accept an offer, I have one piece of advice for them: They might be better off not going to college next year.
Instead, they should consider taking a gap year: deferring college for a year to live and volunteer in a developing country.
In the traditional sort of gap year, students immerse themselves in a developing community to volunteer with a nonprofit organization by teaching, working with local youth, or assuming some other community role.
Gap years have been rising in popularity in the United States, the United Kingdom, Australia, and elsewhere. I’ve spent the last few years researching what happens to young people when they have such an immersive experience in a community radically different from their own.
The answer, in short, is that gap years can help change students in ways the world needs.
The challenges of our time demand an educational system that can help young people to become citizens of the world. We need our students to be smart, critical and innovative thinkers but also people of character who use their talents to help others. Gap years help young adults understand themselves, their relationships, and the world around them, which deepens capacities and perspectives crucial for effective citizenship. They help students become better thinkers and scholars, filled with passion, purpose, and perspective.
How do people learn from gap years?
One principal lesson is clear: We often develop most when our understandings of ourselves and the world around us are challenged -- when we engage with people and ideas that are different. Despite this insight, we often prioritize comfort and self-segregate into groups of sameness. We tend to surround ourselves with people who think, talk, and look similar to us.
Taking a gap year speeds our development by upsetting these patterns. Trying to occupy another's way of life in a different culture -- living with a new family, speaking the language, integrating into a community, perhaps working with local youth, for instance -- these are valuable experiences that help young people understand themselves, develop empathy and virtue, and expand their capacity to see the world from others' perspectives.
Traditionally, U.S. higher education has championed the idea of liberal arts as a way of getting students to engage with difference, to expand their worldview beyond their known universe by (in the words of a Harvard research committee on education) “questioning assumptions, by inducing self-reflection... by encounters with radically different historical moments and cultural formations.”
However, formal classroom education alone cannot accomplish this aim. The classroom is limited in its ability to engage students with difference and contribute to their development as able citizens. We also need new experiences that inspire critical self-reflection to cultivate the right moral feelings and dispositions.
What’s important here is the productive dissonance that these long-term, immersive gap year experiences provide. It's unlikely that a young person staying in America -- or even traveling overseas for a short time -- would have assumptions about herself and the world around her challenged with the same intensity, frequency, and breadth as in a gap year in a developing community.
It's interesting that spending time in developing communities can help young people appreciate ways of living that we need more of -- such as a more active and intimate sense of community. Going overseas also helps to cultivate a type of independence and self-confidence that staying close to home in a familiar environment probably does not.
Furthermore, taking the traditional kind of gap year after high school helps students to take full advantage of their time in college. One telling observation is that many students who take gap years end up changing their intended major after returning. During college, their gap year experiences enrich their courses, strengthen co-curricular endeavors, and animate undergraduate research and creative projects.
To be clear: Though these gap year students are working in partnership with a community organization and aim to make some positive impact, the students typically, at least in the short term, gain more than they are able to give. But this empowers them to bring new perspectives to bear in other personal, professional, and civic efforts. Gap years, borrowing a line from the Rhodes Scholarship Trust, can help create leaders for the world’s future.
Despite the benefits of these kinds of gap year experiences, too few Americans take gap years and too few colleges encourage them. The treadmill from high school to college makes it hard for students to see alternative paths. But that is changing. More people and organizations are beginning to see gap years for the formative experiences they can be, given the proper training, support, and community work. In fact, all the Ivy League universities now endorse gap years for interested students. And they're right to do so.
Many parents and students are nervous about the idea of spending an extended period in a developing country. But these experiences, especially through structured gap year programs like Global Citizen Year, are generally very safe and supported. Are there some risks? Of course, there are risks with any travel or change -- but the risks are worth taking. The investment in a gap year will pay dividends throughout one's college career and beyond, as one's life and society are enriched.
However, one central challenge that remains is how to finance gap years for students from lower-income families. This is also beginning to change. The University of North Carolina and Princeton University, for instance, have both begun to subsidize gap years for incoming students. Other organizations, such as Omprakash, now offer low-cost volunteer placements as well as scholarships to those with need. And with the help of crowdfunding sites, students are able to fund-raise for these experiences with greater ease. Despite these efforts, if gap years are to really expand, we’ll need more institutions or governments to offset the costs.
Higher education is society's last mass effort to really shape the character and trajectories of our young people. Let's help them take fuller advantage of their precious time in college by taking a gap year first.
Joe O’Shea is director of Florida State University's Office of Undergraduate Research and author of Gap Year: How Delaying College Changes People in Ways the World Needs (Johns Hopkins University Press).