At a certain age, you find the slang of the day growing a bit opaque or slippery. Using it becomes a calculated risk. Not that the words or usages are necessarily incomprehensible, though some of them are. (The word “random” now has implications in the American vernacular that I have yet to figure out.) But the unwritten rules of informal correctness are sometimes tricky, and mastering them a challenge.
Simon Blackburn, for many years a professor of philosophy at Oxford, makes use of one recent idiom in his new book Mirror, Mirror: The Uses and Abuses of Self-Love (Princeton University Press) and nearly gets it right. Complaining about “the swarms of ‘selfies’ who infest places of interest, art galleries, concerts, public spaces, and cyberspace,” he elaborates:
“For today’s selfie, the object of each moment is first to record oneself as having been there and second to broadcast the result to as much of the rest of the world as possible. The smartphone is the curse of public space as selfies click away with the lens pointed mainly at themselves and only secondarily at what is around them.”
And so on, in the same vein. (“You selfies get out of my yard!” as it were.) Reaching for the meaning of a recent piece of slang, Blackburn has turned in the right direction but not quite grasped it: “selfie” refers to a photographic self-portrait, not to the person taking the picture. The author is editor of The Oxford Dictionary of Philosophy, so if there is a term for this kind of reversal -- confusing subject and object -- he probably knows it.
But then he may not have fallen behind on the slang at all. “Selfie” really does sound like an apt term for the digital-age clones of Narcissus. Perhaps it will catch on? The phenomenon itself won’t be disappearing any time soon. The recent wildfires in California give us some idea of what an apocalyptic future would look like: people snapping selfies while everything around them is consumed in flames.
The appetite for self-documentation is not just a moral vice or a cultural symptom. The nuisance factor of the selfie is only the most blatant aspect of a tendency that Blackburn identifies as a fundamental problem for both serious thought and everyday life. “With a few exceptions,” he notes, “we can have just about any attitude toward ourselves that we have toward other people, or even to things in the world.”
Ordinary language is full of evidence for this point, since the range of expressions with “self-” as a prefix is remarkably wide and practically uncountable: self-abasement, self-advancement, self-denial, self-respect, self-education, self-inspection, self-consciousness, self-murder…. For now that list should be sufficient, if not self-sufficient. “The exceptions,” Blackburn notes, “only include such trivial things as finding you in my way, which is possible, as opposed to finding myself in my way, which is arguably not, except metaphorically when perhaps it is all too possible.”
So the epistemological and moral problems raised by our relationship with other people or the world (where did they come from? what can we know about them? how should we treat them?) also apply in regard to the self, and are at least as complex, though probably more so. It seems the Oracle of Delphi cunningly hid a vast array of questions in her challenge to Socrates: Know thyself.
After 2,500 years, the mystery has only deepened. Writing in the 18th century, David Hume found the self to be elusive: “For my part, when I enter most intimately into what I call myself, I always stumble on some particular perception or other, of heat or cold, light or shade, love or hatred, pain or pleasure…. When my perceptions are removed for any time, as by sound sleep; so long am I insensible of myself, and may truly be said not to exist.”
Now we map the neural pathways of the brain in fine detail without ever locating the spot where the self picks up its mail. Even so, whole industries exist to provide goods and services to the self -- especially by selling it, if not power, then at least Band-Aids for wounded vanity. Mirror, Mirror quotes Kant’s apt definition of arrogance -- “an unjustified demand that others think little of themselves in comparison with us, a foolishness that acts contrary to its own end” -- which conveys how repulsive and counterproductive it is. And yet costly advertising campaigns are built around models who project, as Blackburn neatly puts it, “the vacant euphoria” of absolute self-absorption: “even [their] blankness is a kind of disdain -- a refusal of human reciprocity.”
The ads embody the fantasy of an invulnerable self, inspiring awe but immune to envy, available for the price of hair-care products or new clothes. It’s a con, but a successful one, meaning that it leaves the mark poorer but no wiser. (Hence still vulnerable.)
Mirror, Mirror is not primarily a work of social or cultural criticism. At the same time, phenomena such as selfies or billboards filled with pouty, sneering, beautiful people are more than just irritants that provoke Blackburn into writing casual but learned essays that leave you wanting to read (or reread) the philosophical and literary works he draws on. That, too. But the author's shuttling between current trends and venerable texts seems enlivened by the occasional hint that his interest is, in part, personal.
That could be my imagination. Blackburn never waxes memoiristic; he uses the first person sparingly. Still, the book implies a quest, Socrates-like, for self-knowledge -- by no means to be confused with what Narcissus was after.
If we’re serious about improving outcomes for our students, we need to make sure the digital transition happens — and happens soon. As I wrote last year, "I’m not talking about a slight or even gradual increase in e-book adoptions... I’m talking about a total transition from a reliance on print textbooks to a full embrace of digital content and learning systems."
For the most part, I’ve been encouraged by the response to the article – from educators, from the industry, and from the hallways of my own company. Yes, I’ve been told a few times that we shouldn’t view technology as a panacea (I don’t), but by far the most common reaction I heard was, “Three years, sounds great!”
Then: “Too bad there’s no way we can pull it off.”
Oh ye of little faith.
With 12 months down and 24 to go until my suggested “digital deadline,” let’s take a look at how much progress we’ve made, how far we still have to go, and what I think the next 12 months will hold for the industry on the journey to our digital future.
Why Digital? Why Now?
The reasons why we need to keep our foot on the gas as we move toward our digital future are clear: Our students aren’t graduating with the knowledge and skills they need to be successful, but they are leaving college with plenty of debt, and in many cases, no degree at all.
As a result, students are turning their backs on higher education. The New York Times recently reported that college enrollment fell 2 percent in 2012-13, and that in 2013-14 “traditional four-year, nonprofit colleges [will] begin a contraction that will last for several years.” While I fully support the idea of different pathways to success, I don’t think that a major shift away from higher education is good for our students – or our country.
These challenges are complex, but they can be addressed – at least in part – by digital. In addition to improving access and affordability, digital can help instructors deliver the type of personalized learning experiences that have the potential to not only boost engagement but make real improvements in grades and graduation rates. If this type of technology exists, why aren’t we doing everything possible to bring it into every classroom in the country?
At McGraw-Hill Higher Education, we’ve seen usage of two of our biggest digital products, LearnSmart and Connect, increase year-over-year by 43 percent and 29 percent, respectively. We’ve also invested more than $130 million in digital R&D over the past year. And before you say, “You’re only making that investment because you expect a return,” let me say that you’re exactly right. Our biggest chance to achieve success as a company is to help instructors and students achieve success. People pay for results, and digital can help drive those results. It’s that simple. And we’re not the only education-focused company making such investments or seeing increased interest in digital products.
“Are We There Yet?” A Year in Review – and a Look Ahead
Last year, I cited a number of trends that illustrate one simple concept: technology is becoming a bigger part of our students’ lives. These trends continue: A July report from Wakefield Research revealed that 99 percent of current students have at least one digital device and 68 percent use at least three devices each day.
But more interesting, I think, is the acceptance of big data in higher education as a positive, disruptive force. Not only have we seen more colleges take a data-driven approach to improving student outcomes, we’ve seen data capture the popular imagination. Just take a look at Nate Silver, the statistician who accounted for nearly 20 percent of the web traffic of The New York Times leading up to the 2012 presidential election.
Ed tech has made similar strides. In general, today's technology is more needs-focused, more thoroughly driven by data and research, and provides a user experience that, if not yet on par with what the consumer tech world offers, has narrowed the gap considerably.
We’ve also seen adaptive learning become a household term. Not only did the industry’s adaptive learning products become better thanks to system refinements and more student data, but we also saw adaptive technology reach into new areas of the learning experience, including e-books and virtual labs.
It’s also quickly gaining the confidence of educators. A 2013 Inside Higher Ed survey revealed that 66 percent of college and university presidents see the potential of adaptive learning to make a positive impact on higher education. And while presidents and faculty members don’t always see eye-to-eye on every use of technology, Inside Higher Ed’s survey of faculty members on technology found that 61 percent of instructors believe that adaptive learning has great potential to have a positive impact on higher education. With more data, more applications, better user experiences, and demonstrated efficacy, I think that greater usage of adaptive learning is the biggest lock of 2013-14.
And then, of course: MOOCs. It seems hard to believe now, but my first article one year ago made no mention of them. MOOCs have been a major story for the past year, for better and for worse. Their future is still very much unwritten, but the important thing is that we saw a come-from-nowhere technology disrupt higher education, and instead of running away from it, many colleges decided to embrace it.
Over the next year, the hype around MOOCs may fade a bit, but their quality and credibility will increase. MOOCs shouldn't be faulted for not always living up to everyone's hopes in Year 1. Now, with the spotlight a little dimmer, we'll see them better position students for success by integrating results-driven technologies like adaptive learning, ultimately becoming a more viable alternative within higher education.
Finally, we've seen institutions use technology to help rethink the very idea of how a higher education institution should operate. I love how Southern New Hampshire University is using technology to shift to a competency-based model, and I expect many more institutions to follow suit over the next year. We've only just started realizing how technology can reshape not just the learning experience but the entire structure of the educational system. We might even see top students earning degrees in as little as a year. It's amazing how the digital transformation can accelerate when colleges begin to think about technology at the organizational level.
The Road Ahead
When I think about what stands in the way of the shift to digital, I keep coming back to seven deadly words: "We’ve never done it that way before." It’s a type of thinking that is, unfortunately, still too common in education, and one that we must break away from in order to move forward. Because if we hold on to the past, we must realize that we're holding on to something that's broken.
One thing, however, should not change, and that's the importance of instructors. There are some who see an inverse relationship between teaching and technology, who believe that adopting technology necessarily means marginalizing the role of the instructor. I just don't believe that's the case. Technology's goal is to help instructors provide more efficient, effective instruction. It's the means, not the end.
A few more things I think we’ll see over the next 12 months:
Major learning companies offering some print products only through a custom or "on demand" model. I can say for sure that this will be the case at McGraw-Hill Higher Education. And one day in the not-so-distant future, we won’t offer those print products at all.
New models for affordable, accredited education. MOOCs won't be the only game in town, as a slate of new players will find a way to deliver high quality, low cost (but not free) higher education that leads to a degree.
More colleges institutionalizing data collection and analysis. These capabilities can't be developed overnight, but in 2013-14, we'll pass the tipping point of colleges and universities using data to drive what we at McGraw-Hill Education refer to as The Big 3: results, recruitment and retention.
The continued relevance of content. As Peter Kafka of AllThingsD recently tweeted: “Tech guy to content guy: You're screwed! Now, please help me build my business.” Even the best technology in the world must still be paired with trusted, proven content in order to be effective, and I think the future of our industry belongs to companies that can provide the best of both worlds.
Twenty-four months out from the digital deadline, our progress is good. As an industry, we have a clear understanding of the problems we face and how digital technology can help solve them, and there’s a general spirit of collaboration among colleges, learning companies and start-ups that is moving us, together, in the right direction. It’s inspiring, and it’s something I can’t say I’ve ever felt before.
They say that you never notice change happening and then one day it just hits you. Consider this your friendly 24-month warning.
Brian Kibby is president of McGraw-Hill Higher Education.