I, like many others, have been thinking a lot about Apple since the untimely death of its co-founder, Steve Jobs. One theme that has consistently emerged in reflections on Jobs’ legacy is the extent to which he taught so many of us how to navigate the information age. Jobs made his name by inventing a computer, but his most lasting impact will be the ways he gave consumers access to a range of devices that make the Internet indispensable.
Yet for all of his influence, Jobs’ vision for Apple is not the only model for how to manage the information age. Apple focuses on great hardware, but there is another approach that has been equally successful in getting people online. I am referring, of course, to Google. If Apple is a hardware company, Google is a software company, one whose innovation and success reside in its ability to create compelling Internet-based software applications that help make the Internet both manageable and productive.
I have been thinking about Apple vs. Google because the story of these two companies helps us explore how higher education might navigate the new realities of the information age. The problem of higher education and the information age is an acute one. The basic structure of modern higher education has been in place, remarkably untouched, since the Middle Ages. It consists of three parts: campuses, including buildings for classrooms; professors; and libraries. Each part is equally important: a college is a place where a student goes to gain access to deposits of knowledge and to expert professors, who teach through lectures and other formalized methods. If the modern university has tweaked that model, it has only been to expand the idea of the campus to include student services, athletic facilities and so forth. Otherwise, the basic medieval structure remains largely intact.
This is a story that has been well told, but it is worth recalling here because life in the information age increasingly challenges the notion that the university must be a static place where students go to learn. It may even be that the current decentralization of social interaction will make the old structure of higher education obsolete, in which case the crisis of higher education has become very acute indeed.
So what possibilities for reform do Google and Apple offer? I begin with Apple, whose example suggests that traditional colleges can succeed by following Apple’s hardware-centered approach. Apple is a hardware company. Its primary purpose is to sell devices that provide the gateway to the information age: iPods, iPhones and iPads, along with more than a few computers. It has great software, but it uses software to enhance the hardware. Apple’s guiding principle is that the appliance--the device--is the key to unlocking the Internet, and its primary focus has been to produce and sell great Internet appliances.
One of Steve Jobs’ most firmly held principles was that the total user experience is what makes Apple’s devices so special, which is why Apple wants to control as much of the device’s “infrastructure” as possible in order to preserve that experience. At its best, an Apple device can at once draw you in and then, once you start using it, disappear so that you can focus on the task at hand. The reason people continue to be drawn to Apple products is that their total design stimulates the user’s creativity and productivity.
The infrastructure of a higher education institution in the Apple model should have the same effect. Entering an “Apple” campus should stimulate a student to want to learn, to engage in the life of the mind, to explore new ideas and avenues of inquiry. Design matters here for the same reasons Steve Jobs obsessed about design, because it is impossible to achieve this effect without intense planning, investment and attention to detail. Failure to attend to the totality of the student experience renders the campus experience less than immersive and, so, less than functional.
Likewise, hardware and software are intertwined in this model. The curriculum in an Apple-modeled university serves the same function as iOS (the software that drives Apple’s mobile devices) does in the iPhone: it is the means by which the student gets access to the total experience. This means that universities should take into account both hardware and software when designing that total experience. Questions such as how the physical location and facilities of the university shape the curriculum, or how the curriculum shapes the physical location and infrastructure, are crucial.
In contrast to the Apple model’s emphasis on “hardware,” a Google approach suggests that there is also room in the educational spectrum for a software-oriented model. Google is actually an advertising company, but it uses its innovative, easy-to-use software to get us on the Internet so that we will look at its ads. The tradeoff for the consumer is that Google’s products help users to access (through search) and manage (through Gmail and other applications) the Internet productively. Google’s products work as well as their desktop complements, but they are cloud-based, which means that they can exploit some of the features of the Internet (Google’s products excel at collaboration, for example), and that we can use them regardless of location and hardware configuration. I’ve written this essay on Google Apps using a seven-year-old laptop, an Internet terminal at the library, a smartphone during my daughter’s ballet practice, and a desktop computer. The result is an experience much different from being tethered to a single desk and hard drive.
A school in this Google model derives its identity from its faculty and curriculum, or its “software,” while de-emphasizing the importance of its infrastructure, such as its classrooms, library and other campus facilities. In other words, it is possible to provide a first-class education in a school without a full range of campus facilities (or maybe even a school without a traditional campus) as long as the curriculum gives students access to the right kind of critical thinking, formation and training. It used to be that providing a first-class education required institutions to assemble all three components: faculty, library and classrooms. The Google model suggests that it is possible to re-conceive that structure entirely by shifting the focus to curriculum (and the necessary faculty to teach it) and then adapting whatever “hardware” is available to give the curriculum a platform.
The key to this model is the curriculum. There are a number of reasons why traditional higher education institutions have gotten away with fairly generic curricula (i.e., a series of courses taught in classrooms via lectures and discussion), but one of the most important is that the other components offset the inadequacies of the curriculum. Stripping away the infrastructure exposes the curriculum and demands that it be effective and have integrity on its own. Stripping away the infrastructure, however, also frees the curriculum to provide new and dynamic ways of learning. If you have a classroom or library, you have to use it. If you do not have a classroom, then entirely new educational opportunities present themselves.
There have been curriculum-driven colleges for a long time, of course. St. John’s College in Annapolis, The Evergreen State College, Babson College and a number of others have occupied an important niche in American higher education precisely by offering innovative, non-traditional curricula. The Google model allows us to push this idea much further. Older generations of curriculum-oriented schools have generally focused on very specific skills or objects of study, such as developing critical skills through an engagement with the Great Books. Modern educational technology, however, gives curriculum-based colleges access to new avenues of information and content delivery. So it is now possible to conceive of a school that uses an innovative curriculum for career preparation, perhaps by combining rigorous critical thinking with on-the-job training.
The Google model is not without significant risks. There is a real danger, for example, in trying to create efficiencies without recognizing what the Google model actually implies. Being curriculum-centered frees both faculty and students to be innovative and distinctive without needing extensive infrastructure. What it does not do is force that college to move online or to rely on remote classrooms. To do this is simply to perpetuate the structures of a traditional university in a virtual environment, and it’s not clear that making a traditional university education more impersonal is conducive to good education. This is not a call to automate the classroom, nor is it an attempt to de-emphasize the centrality of trained, professional teachers and researchers.
In fact, my sense is that following the Google model will require more and better teachers because it will allow for a greater degree of personal contact between students and their teachers. Having access to reams of resources and information is a good thing, and all forms of higher education would do well to attend to those resources. But having access to information is not the same as being able to process that information, and finding skilled guides who can help students sort information and generate new knowledge becomes essential to the future of higher education.
If it sounds like I am arguing for Google over Apple, that is not strictly true. Both models have the potential to offer compelling educational experiences that mirror how information age students actually think and learn. Part of my concern is that more and more higher education institutions lack the resources or design savvy to pursue an infrastructure-heavy model. The success of Google shows that there is another approach, one that authentically serves information age students while allowing universities to excel without having to produce the next iPad.
Mark Weedman is a professor of biblical studies at Crossroads College, in Minnesota.
Today's college graduates are entering a world where much of the most dynamic and important work will not be performed in a solitary office or around a physical conference table, but in virtual teams of individuals scattered around the globe. This mode of work, once confined to high-level scientific research, is already the norm in many fields, and it is accelerating quickly in almost every area of human endeavor.
Yet we do very little in our undergraduate curriculums to prepare students for this essential aspect of their future professional lives. Yes, we are incorporating technology into the classroom, and an increasing number of higher education projects encourage online cross-cultural dialogue. It is beyond obvious that our students live comfortably in the casual give-and-take of the online social media environment.
These interactions, however, lack an important element: the conception, the development and the completion of tangible intellectual products. Our students don’t need to learn how to communicate online -- they need to learn how to work together to get something done. It's producing good work together that is the key, ideally in a way where students each contribute both according to their individual talents and interests and also according to their physical and cultural situations.
The most straightforward way to encourage virtual collaboration is for students to conduct comparative primary research on topics of international importance. Under the guidance of a virtual team of faculty members, students can gather data, share it online, and work together to analyze their findings.
There are countless opportunities for faculty and students from many disciplines to put this into practice. Environmental studies students in Boston, Amsterdam, and Mumbai can gather and combine local data on sea levels and coastal erosion to understand better the effects of global climate change. Business students in San Francisco, Haifa, and Sao Paolo can create comparative case studies of successful entrepreneurship. Theater students in Colombo, Belgrade and New Orleans can compare the use of drama in addressing issues of racial or religious conflict.
In some cases, undergraduate virtual collaborations may produce surprising contributions to scholarly knowledge. In many cases, they may not. In others, of course, students will make different interpretations and come to different conclusions. Confronting intellectual disagreement is an important part of the process. And what matters most is that students develop tools that they will be using for many years to come.
Beyond undertaking parallel research projects, students in virtual teams have the opportunity, under the guidance of faculty members, to undertake more complex analyses of major global issues and problems. Our students will be working in environments where they will have to confront and integrate strikingly different perspectives into their ideas and plans for action. Virtual teamwork at the undergraduate level can deepen understanding and encourage them to begin this process early.
Online virtual collaboration involves at least four crucial skills. First, there is simply the skill of working well with others in a collaborative environment. Second, there is making efficient and effective use of technology to increase and disseminate knowledge. Third, there is working respectfully and productively across borders and cultural boundaries. Fourth, there is developing new categories of thought and analysis, made possible through direct interchange with peers. It is true that we address each of these skills to some extent in other ways, through online coursework and efforts to internationalize our campuses. But we seldom challenge our students to put these skills to use in the service of the heart of their work.
Creating opportunities for product-driven virtual teamwork may sound simpler in theory than it is in practice. The technological tools are readily available, but the development process requires considerable faculty time.
For colleges that send significant numbers of students abroad each year, one model might be to engage those students, scattered in sites around the world, on projects that draw on the very different curricula they are studying and environments in which they are living. The advantage of this model is that faculty members have the opportunity to work face-to-face with the groups both before and after their virtual team experience.
Another model involves establishing strong relationships between faculty members in similar fields at overseas institutions. In this model, students in virtual teams may never have met one another in person, but work together via technology under the direction of the participating faculty. It is likely, however, that this model will work best when faculty members themselves have had substantial opportunities to talk and compare notes in person. After all, faculty members must work together to develop a framework for student research so that these online interactions are genuinely productive, and not merely a gimmick.
For this reason, virtual teamwork will only become widespread at the undergraduate level if it is strongly supported, giving faculty members the time and the incentive to develop these modules, and in some cases travel resources to establish face-to-face connections that can then be built on in the online environment.
Virtual teamwork cannot and should not simply replace individual intellectual endeavor. But it is a vital component of the production and dissemination of knowledge in the professional world – including the world of faculty research. If we don’t give our students the chance to practice, our curriculums will be needlessly divorced from one of the most dynamic trends of our time.
There has been much talk of the “online revolution” in higher education. While there is a place for online education, some of its boosters anticipate displacing the traditional campus altogether. A close reading of their arguments, however, makes clear that many share what might be called the “individualist fallacy,” both in their understanding of how students learn and how professors teach.
Of course, individualism has a long, noble heritage in American history. From the “age of the self-made man” onward, we have valued those who pull themselves up by their own bootstraps. But, as Warren Buffett has made clear, even the most successful individuals depend heavily on the cultural, economic, legal, political, and social contexts in which they act. This is as true for Buffett as it is for other so-called self-made men, such as Bill Gates. And it is certainly true for students.
But many advocates of online learning ignore this simple point. The economist Richard Vedder, for example, believes that being on campus is only useful for “making friends, partying, drinking, and having sex.” Anya Kamenetz, in her book DIY U, celebrates the day when individuals are liberated from the constraints of physical campuses, while Gates anticipates that “five years from now on the Web for free you’ll be able to find the best lectures in the world. It will be better than any single university.”
These advocates of online higher education forget the importance of institutional culture in shaping how people learn. College is about more than accessing information; it’s about developing an attitude toward knowledge.
There is a difference between being on a campus with other students and teachers committed to learning and sitting at home. Learning, like religion, is a social experience. Context matters. No matter how much we might learn about God and our obligations from the Web, it is by going to church and being surrounded by other congregants engaged in similar questions, under the guidance of a thoughtful, caring pastor, that we really change. Conversion is social, and so is learning.
Like all adults, students will pursue many activities during their time on campus, but what distinguishes a college is that it embodies ideals distinct from the rest of students’ lives. If we take college seriously, we need people to spend time in such places so that they will leave different than when they entered.
Some argue that large lecture courses make a mockery of the above claims. Admittedly, in a better world, there would be no large lecture courses. Still, this argument misleads for several reasons. First, it generalizes from one kind of course, ignoring the smaller class sizes at community colleges and the upper-division courses in which students interact closely with each other and their professors. Second, it dismisses the energy of being in a classroom, even a large one, with real people when compared to being on our own. Even in large classes, good teachers push their students to think by asking probing questions, modeling curiosity, and adapting to the class’s needs. Finally, it disregards the importance of the broader campus context in which all classes, large and small, take place.
The goal of bringing students to campus for several years is to immerse them in an environment in which learning is the highest value, something online environments, no matter how interactive, cannot simulate. Real learning is hard; it requires students to trust each other and their teachers. In other words, it depends on relationships. This is particularly important for the liberal arts.
Of course, as Richard Arum and Josipa Roksa’s recent study Academically Adrift makes clear, there are great variations in what college students are learning. All too often, higher education does not fulfill our aspirations. But none of the problems Arum and Roksa identify are ones that online higher education would solve. As Arum and Roksa make clear, students learn more on campuses where learning is valued and expectations are high. If anything, we need to pay more attention to institutional culture because it matters so much.
This does not mean that we should reject technology when it can further learning, as in new computer programs that help diagnose students’ specific stumbling blocks. But computers will never replace the inspiring, often unexpected, conversations that happen among students and between students and teachers on campuses. Because computers are not interpretive moral beings, they cannot evaluate assignments in which students are asked to reflect on complicated ideas or come up with new ones, especially concerning moral questions. Fundamentally, computers cannot cultivate curiosity because machines are not curious.
Technology is a tool, not an end in itself. As the computer scientist Jaron Lanier has written in his book You Are Not A Gadget, computers exist to support human endeavors, not the other way around. Many techno-utopists proclaim that computers are becoming smarter, more human, but Lanier wonders whether that is because we tend to reduce our human horizons to interact with our machines. This certainly is one of the dangers of online higher education.
The individualist fallacy applies not just to online advocates’ understandings of students, but also their conception of what makes great teachers and scholars. Vedder, for example, echoes Gates in his hope that someday there will be a Wikipedia University, or that the Gates Foundation will start a university in which a few “star professors” are paid to teach thousands of students across the nation and world. Of course, this has been happening since the invention of cassette tapes that offer “the great courses.” This is hardly innovative, nor does it a college education make.
Vedder ignores how star professors become great. How do they know what to teach and to write? Their success, like Buffett’s, is social: they converse with and read and rely on the work of hundreds, even thousands, of other scholars. Read their articles and books, listen to their lectures, and you can discern how deeply influenced and how dependent they are on the work of their peers. In short, there would be no star professors absent an academy of scholars committed to research.
Schools like the online, Gates Foundation-funded Western Governors University free-ride off the expensive, quality research completed by traditional professors when they rely on open courseware and curricula. Take away the professors, and many online schools will teach material that is out of date or inaccurate or, worse, hand control over to other entities that are not interested in promoting the truth -- from textbook companies seeking to maximize sales to coal and pharmaceutical companies offering their own curriculums for “free.”
The Web and new technologies are great tools; they have made more information more accessible to more people. This is to be celebrated. Citizens in a democracy should be able to access as much information as freely as possible. A democratic society cannot allow scholars, or anyone else, to be the gatekeepers to knowledge.
Certainly, we will expand online higher education, if for no other reason than because wealthy foundations like Gates and ambitious for-profit entities are putting their money and power behind it. For certain students, especially working adults pursuing clearly defined vocational programs rather than a liberal arts education, online programs may allow opportunities that they would have otherwise foregone. But online higher education will never replace, much less replicate, what happens on college campuses.
Even as we expand online, therefore, we must deepen our commitment to those institutions that cultivate a love of learning in their students, focus on the liberal arts, and produce the knowledge that online and offline teaching requires.